| sha (stringlengths 40–40) | text (stringlengths 1–13.4M) | id (stringlengths 2–117) | tags (sequencelengths 1–7.91k) | created_at (stringlengths 25–25) | metadata (stringlengths 2–875k) | last_modified (stringlengths 25–25) | arxiv (sequencelengths 0–25) | languages (sequencelengths 0–7.91k) | tags_str (stringlengths 17–159k) | text_str (stringlengths 1–447k) | text_lists (sequencelengths 0–352) | processed_texts (sequencelengths 1–353) |
---|---|---|---|---|---|---|---|---|---|---|---|---|
a9c0fe1ad6f2f05f4ff07d1f134f412a354f64a5 |
Retail Car Dealership Data
_____
Data for a car dealership. Perform EDA, extract features, and clean it up. Source: Kaggle.
Try it out! Its primary goal is to provide an interface for users to download the dataset and try it out. | cs-uche/car_dealership | [
"task_categories:feature-extraction",
"size_categories:1M<n<10M",
"language:en",
"license:apache-2.0",
"retail",
"car",
"region:us"
] | 2024-02-02T16:37:38+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["1M<n<10M"], "task_categories": ["feature-extraction"], "pretty_name": "Car Dealership", "tags": ["retail", "car"]} | 2024-02-02T17:12:35+00:00 | [] | [
"en"
] | TAGS
#task_categories-feature-extraction #size_categories-1M<n<10M #language-English #license-apache-2.0 #retail #car #region-us
|
Retail Car Dealership Data
_____
Data for a car dealership. Perform EDA, extract features, and clean it up. Source: Kaggle.
Try it out! Its primary goal is to provide an interface for users to download the dataset and try it out. | [] | [
"TAGS\n#task_categories-feature-extraction #size_categories-1M<n<10M #language-English #license-apache-2.0 #retail #car #region-us \n"
] |
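The card above invites readers to try the dataset out; a minimal sketch of doing so with the `datasets` library (the "train" split name is an assumption, since the card does not state one):

```python
from datasets import load_dataset

# Load the retail car dealership dataset from the Hugging Face Hub.
# The "train" split is assumed; check the repo for the actual split names.
dealership = load_dataset("cs-uche/car_dealership", split="train")
print(dealership)
```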
ccccbd8ee1f44a5276483bbb60e4ea32558587dc |
# Dataset Card for Evaluation run of BFauber/santa1.1b_10e6
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BFauber/santa1.1b_10e6](https://huggingface.co/BFauber/santa1.1b_10e6) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BFauber__santa1.1b_10e6",
"harness_winogrande_5",
split="train")
```
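Each of the 63 configurations can be discovered programmatically before picking one to load; a minimal sketch using `datasets`:

```python
from datasets import get_dataset_config_names

# Enumerate every configuration (one per evaluated task, plus "results").
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_BFauber__santa1.1b_10e6"
)
print(len(configs), configs[:5])
```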
## Latest results
These are the [latest results from run 2024-02-02T16:45:17.902394](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__santa1.1b_10e6/blob/main/results_2024-02-02T16-45-17.902394.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.253990042460175,
"acc_stderr": 0.03095765578044554,
"acc_norm": 0.25469640757519596,
"acc_norm_stderr": 0.03177695985490728,
"mc1": 0.2423500611995104,
"mc1_stderr": 0.015000674373570345,
"mc2": 0.49395170292668467,
"mc2_stderr": 0.01700497674517879
},
"harness|arc:challenge|25": {
"acc": 0.24914675767918087,
"acc_stderr": 0.012639407111926435,
"acc_norm": 0.2764505119453925,
"acc_norm_stderr": 0.013069662474252425
},
"harness|hellaswag|10": {
"acc": 0.2555267874925314,
"acc_stderr": 0.004352655263682343,
"acc_norm": 0.2638916550487951,
"acc_norm_stderr": 0.004398404992933846
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.03633384414073462,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.03633384414073462
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.29605263157894735,
"acc_stderr": 0.03715062154998905,
"acc_norm": 0.29605263157894735,
"acc_norm_stderr": 0.03715062154998905
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403325,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403325
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2490566037735849,
"acc_stderr": 0.026616482980501708,
"acc_norm": 0.2490566037735849,
"acc_norm_stderr": 0.026616482980501708
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483098,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483098
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.04336432707993177,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.04336432707993177
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.22127659574468084,
"acc_stderr": 0.02713634960242406,
"acc_norm": 0.22127659574468084,
"acc_norm_stderr": 0.02713634960242406
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436695,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436695
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2827586206896552,
"acc_stderr": 0.037528339580033376,
"acc_norm": 0.2827586206896552,
"acc_norm_stderr": 0.037528339580033376
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.02201908001221789,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.02201908001221789
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.03809523809523811,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.03809523809523811
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1967741935483871,
"acc_stderr": 0.022616409420742025,
"acc_norm": 0.1967741935483871,
"acc_norm_stderr": 0.022616409420742025
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2561576354679803,
"acc_stderr": 0.030712730070982592,
"acc_norm": 0.2561576354679803,
"acc_norm_stderr": 0.030712730070982592
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2606060606060606,
"acc_stderr": 0.03427743175816524,
"acc_norm": 0.2606060606060606,
"acc_norm_stderr": 0.03427743175816524
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2676767676767677,
"acc_stderr": 0.03154449888270286,
"acc_norm": 0.2676767676767677,
"acc_norm_stderr": 0.03154449888270286
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22279792746113988,
"acc_stderr": 0.030031147977641545,
"acc_norm": 0.22279792746113988,
"acc_norm_stderr": 0.030031147977641545
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2230769230769231,
"acc_stderr": 0.021107730127243998,
"acc_norm": 0.2230769230769231,
"acc_norm_stderr": 0.021107730127243998
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2815126050420168,
"acc_stderr": 0.02921354941437217,
"acc_norm": 0.2815126050420168,
"acc_norm_stderr": 0.02921354941437217
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.037101857261199946,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.037101857261199946
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23669724770642203,
"acc_stderr": 0.01822407811729908,
"acc_norm": 0.23669724770642203,
"acc_norm_stderr": 0.01822407811729908
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.24537037037037038,
"acc_stderr": 0.02934666509437295,
"acc_norm": 0.24537037037037038,
"acc_norm_stderr": 0.02934666509437295
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23039215686274508,
"acc_stderr": 0.029554292605695066,
"acc_norm": 0.23039215686274508,
"acc_norm_stderr": 0.029554292605695066
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2742616033755274,
"acc_stderr": 0.02904133351059804,
"acc_norm": 0.2742616033755274,
"acc_norm_stderr": 0.02904133351059804
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.20179372197309417,
"acc_stderr": 0.02693611191280227,
"acc_norm": 0.20179372197309417,
"acc_norm_stderr": 0.02693611191280227
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2748091603053435,
"acc_stderr": 0.03915345408847836,
"acc_norm": 0.2748091603053435,
"acc_norm_stderr": 0.03915345408847836
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2809917355371901,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.2809917355371901,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.22330097087378642,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.22330097087378642,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.02723601394619669,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.02723601394619669
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.21455938697318008,
"acc_stderr": 0.014680033956893346,
"acc_norm": 0.21455938697318008,
"acc_norm_stderr": 0.014680033956893346
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2947976878612717,
"acc_stderr": 0.024547617794803838,
"acc_norm": 0.2947976878612717,
"acc_norm_stderr": 0.024547617794803838
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2581005586592179,
"acc_stderr": 0.014635185616527836,
"acc_norm": 0.2581005586592179,
"acc_norm_stderr": 0.014635185616527836
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.024739981355113592,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.024739981355113592
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.22186495176848875,
"acc_stderr": 0.02359885829286305,
"acc_norm": 0.22186495176848875,
"acc_norm_stderr": 0.02359885829286305
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.30246913580246915,
"acc_stderr": 0.025557653981868055,
"acc_norm": 0.30246913580246915,
"acc_norm_stderr": 0.025557653981868055
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24822695035460993,
"acc_stderr": 0.025770015644290406,
"acc_norm": 0.24822695035460993,
"acc_norm_stderr": 0.025770015644290406
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2503259452411995,
"acc_stderr": 0.011064151027165441,
"acc_norm": 0.2503259452411995,
"acc_norm_stderr": 0.011064151027165441
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2757352941176471,
"acc_stderr": 0.027146271936625166,
"acc_norm": 0.2757352941176471,
"acc_norm_stderr": 0.027146271936625166
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2581699346405229,
"acc_stderr": 0.017704531653250075,
"acc_norm": 0.2581699346405229,
"acc_norm_stderr": 0.017704531653250075
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.038950910157241364,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.038950910157241364
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.22448979591836735,
"acc_stderr": 0.02671143055553841,
"acc_norm": 0.22448979591836735,
"acc_norm_stderr": 0.02671143055553841
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2736318407960199,
"acc_stderr": 0.03152439186555404,
"acc_norm": 0.2736318407960199,
"acc_norm_stderr": 0.03152439186555404
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.03460579907553026,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.03460579907553026
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.034462962170884265,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.034462962170884265
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2423500611995104,
"mc1_stderr": 0.015000674373570345,
"mc2": 0.49395170292668467,
"mc2_stderr": 0.01700497674517879
},
"harness|winogrande|5": {
"acc": 0.5019731649565904,
"acc_stderr": 0.014052376259225643
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
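To consume these numbers programmatically, the results file can be fetched straight from the dataset repo; a minimal sketch using `huggingface_hub` (the filename comes from the link above, and the exact top-level layout of the JSON is an assumption hedged in the comments):

```python
import json
from huggingface_hub import hf_hub_download

# Fetch the latest results file from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_BFauber__santa1.1b_10e6",
    filename="results_2024-02-02T16-45-17.902394.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# The aggregate block shown above may sit under a top-level "results" key
# (an assumption about the file layout); fall back to the root if not.
metrics = data.get("results", data)
print(metrics["all"]["acc"], metrics["all"]["acc_norm"])
```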
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_BFauber__santa1.1b_10e6 | [
"region:us"
] | 2024-02-02T16:47:11+00:00 | {"pretty_name": "Evaluation run of BFauber/santa1.1b_10e6", "dataset_summary": "Dataset automatically created during the evaluation run of model [BFauber/santa1.1b_10e6](https://huggingface.co/BFauber/santa1.1b_10e6) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BFauber__santa1.1b_10e6\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T16:45:17.902394](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__santa1.1b_10e6/blob/main/results_2024-02-02T16-45-17.902394.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.253990042460175,\n \"acc_stderr\": 0.03095765578044554,\n \"acc_norm\": 0.25469640757519596,\n \"acc_norm_stderr\": 0.03177695985490728,\n \"mc1\": 0.2423500611995104,\n \"mc1_stderr\": 0.015000674373570345,\n \"mc2\": 0.49395170292668467,\n \"mc2_stderr\": 0.01700497674517879\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.24914675767918087,\n \"acc_stderr\": 0.012639407111926435,\n \"acc_norm\": 0.2764505119453925,\n \"acc_norm_stderr\": 0.013069662474252425\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2555267874925314,\n \"acc_stderr\": 0.004352655263682343,\n \"acc_norm\": 0.2638916550487951,\n \"acc_norm_stderr\": 0.004398404992933846\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n \"acc_stderr\": 0.03633384414073462,\n \"acc_norm\": 0.22962962962962963,\n \"acc_norm_stderr\": 0.03633384414073462\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.29605263157894735,\n \"acc_stderr\": 0.03715062154998905,\n \"acc_norm\": 0.29605263157894735,\n \"acc_norm_stderr\": 0.03715062154998905\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.04093601807403325,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.04093601807403325\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2490566037735849,\n \"acc_stderr\": 0.026616482980501708,\n \"acc_norm\": 0.2490566037735849,\n \"acc_norm_stderr\": 0.026616482980501708\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2361111111111111,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.2361111111111111,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.27,\n 
\"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n \"acc_stderr\": 0.03242414757483098,\n \"acc_norm\": 0.23699421965317918,\n \"acc_norm_stderr\": 0.03242414757483098\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.04336432707993177,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.04336432707993177\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.22127659574468084,\n \"acc_stderr\": 0.02713634960242406,\n \"acc_norm\": 0.22127659574468084,\n \"acc_norm_stderr\": 0.02713634960242406\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n \"acc_stderr\": 0.040969851398436695,\n \"acc_norm\": 0.2543859649122807,\n \"acc_norm_stderr\": 0.040969851398436695\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2827586206896552,\n \"acc_stderr\": 0.037528339580033376,\n \"acc_norm\": 0.2827586206896552,\n \"acc_norm_stderr\": 0.037528339580033376\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.02201908001221789,\n \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.02201908001221789\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n \"acc_stderr\": 0.03809523809523811,\n \"acc_norm\": 0.23809523809523808,\n \"acc_norm_stderr\": 0.03809523809523811\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1967741935483871,\n \"acc_stderr\": 0.022616409420742025,\n \"acc_norm\": 0.1967741935483871,\n \"acc_norm_stderr\": 0.022616409420742025\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2561576354679803,\n \"acc_stderr\": 0.030712730070982592,\n \"acc_norm\": 0.2561576354679803,\n \"acc_norm_stderr\": 0.030712730070982592\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2606060606060606,\n \"acc_stderr\": 0.03427743175816524,\n \"acc_norm\": 0.2606060606060606,\n \"acc_norm_stderr\": 0.03427743175816524\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.2676767676767677,\n \"acc_stderr\": 0.03154449888270286,\n \"acc_norm\": 0.2676767676767677,\n \"acc_norm_stderr\": 0.03154449888270286\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.22279792746113988,\n \"acc_stderr\": 0.030031147977641545,\n \"acc_norm\": 0.22279792746113988,\n \"acc_norm_stderr\": 0.030031147977641545\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2230769230769231,\n \"acc_stderr\": 0.021107730127243998,\n \"acc_norm\": 0.2230769230769231,\n \"acc_norm_stderr\": 0.021107730127243998\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.2815126050420168,\n \"acc_stderr\": 0.02921354941437217,\n \"acc_norm\": 0.2815126050420168,\n \"acc_norm_stderr\": 0.02921354941437217\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2913907284768212,\n \"acc_stderr\": 0.037101857261199946,\n \"acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.037101857261199946\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.23669724770642203,\n \"acc_stderr\": 0.01822407811729908,\n \"acc_norm\": 0.23669724770642203,\n \"acc_norm_stderr\": 0.01822407811729908\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.24537037037037038,\n \"acc_stderr\": 0.02934666509437295,\n \"acc_norm\": 0.24537037037037038,\n \"acc_norm_stderr\": 0.02934666509437295\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.23039215686274508,\n \"acc_stderr\": 0.029554292605695066,\n \"acc_norm\": 0.23039215686274508,\n \"acc_norm_stderr\": 0.029554292605695066\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2742616033755274,\n \"acc_stderr\": 0.02904133351059804,\n \"acc_norm\": 0.2742616033755274,\n \"acc_norm_stderr\": 0.02904133351059804\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.20179372197309417,\n \"acc_stderr\": 0.02693611191280227,\n \"acc_norm\": 0.20179372197309417,\n \"acc_norm_stderr\": 0.02693611191280227\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.03915345408847836,\n \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.03915345408847836\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2809917355371901,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\": 0.2809917355371901,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.21296296296296297,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.22330097087378642,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.22330097087378642,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.02723601394619669,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.02723601394619669\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.21455938697318008,\n \"acc_stderr\": 0.014680033956893346,\n \"acc_norm\": 0.21455938697318008,\n \"acc_norm_stderr\": 0.014680033956893346\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2947976878612717,\n \"acc_stderr\": 0.024547617794803838,\n \"acc_norm\": 0.2947976878612717,\n \"acc_norm_stderr\": 0.024547617794803838\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2581005586592179,\n \"acc_stderr\": 0.014635185616527836,\n \"acc_norm\": 0.2581005586592179,\n \"acc_norm_stderr\": 0.014635185616527836\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.24836601307189543,\n \"acc_stderr\": 0.024739981355113592,\n \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.024739981355113592\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.22186495176848875,\n \"acc_stderr\": 0.02359885829286305,\n \"acc_norm\": 0.22186495176848875,\n \"acc_norm_stderr\": 0.02359885829286305\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.30246913580246915,\n \"acc_stderr\": 0.025557653981868055,\n \"acc_norm\": 0.30246913580246915,\n \"acc_norm_stderr\": 0.025557653981868055\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.24822695035460993,\n \"acc_stderr\": 0.025770015644290406,\n \"acc_norm\": 0.24822695035460993,\n \"acc_norm_stderr\": 0.025770015644290406\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2503259452411995,\n \"acc_stderr\": 0.011064151027165441,\n \"acc_norm\": 0.2503259452411995,\n \"acc_norm_stderr\": 0.011064151027165441\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.2757352941176471,\n \"acc_stderr\": 0.027146271936625166,\n \"acc_norm\": 0.2757352941176471,\n \"acc_norm_stderr\": 0.027146271936625166\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2581699346405229,\n \"acc_stderr\": 0.017704531653250075,\n \"acc_norm\": 0.2581699346405229,\n \"acc_norm_stderr\": 0.017704531653250075\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n \"acc_stderr\": 0.038950910157241364,\n \"acc_norm\": 0.20909090909090908,\n \"acc_norm_stderr\": 0.038950910157241364\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.22448979591836735,\n \"acc_stderr\": 0.02671143055553841,\n \"acc_norm\": 0.22448979591836735,\n \"acc_norm_stderr\": 0.02671143055553841\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2736318407960199,\n \"acc_stderr\": 0.03152439186555404,\n \"acc_norm\": 0.2736318407960199,\n \"acc_norm_stderr\": 0.03152439186555404\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n \"acc_stderr\": 0.03460579907553026,\n \"acc_norm\": 0.2710843373493976,\n \"acc_norm_stderr\": 0.03460579907553026\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.034462962170884265,\n \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.034462962170884265\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2423500611995104,\n \"mc1_stderr\": 0.015000674373570345,\n \"mc2\": 0.49395170292668467,\n \"mc2_stderr\": 0.01700497674517879\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5019731649565904,\n \"acc_stderr\": 0.014052376259225643\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", 
"repo_url": "https://huggingface.co/BFauber/santa1.1b_10e6", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|arc:challenge|25_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|gsm8k|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hellaswag|10_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T16-45-17.902394.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T16-45-17.902394.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T16-45-17.902394.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T16-45-17.902394.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T16-45-17.902394.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T16_45_17.902394", "path": ["**/details_harness|winogrande|5_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T16-45-17.902394.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T16_45_17.902394", "path": ["results_2024-02-02T16-45-17.902394.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T16-45-17.902394.parquet"]}]}]} | 2024-02-02T16:47:35+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of BFauber/santa1.1b_10e6
Dataset automatically created during the evaluation run of model BFauber/santa1.1b_10e6 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-02-02T16:45:17.902394 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of BFauber/santa1.1b_10e6\n\n\n\nDataset automatically created during the evaluation run of model BFauber/santa1.1b_10e6 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T16:45:17.902394(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of BFauber/santa1.1b_10e6\n\n\n\nDataset automatically created during the evaluation run of model BFauber/santa1.1b_10e6 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T16:45:17.902394(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
d8ccd873875af8c3d84b8713ffde59241380f6b9 |
# Dataset Card for Evaluation run of Sharathhebbar24/Med_GPT2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Sharathhebbar24/Med_GPT2](https://huggingface.co/Sharathhebbar24/Med_GPT2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sharathhebbar24__Med_GPT2",
"harness_winogrande_5",
split="train")
```
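
The same call works for the other configurations declared in this dataset's metadata. For example, a minimal sketch for pulling the aggregated run-level metrics, assuming the "results" configuration and "latest" split names shown in the configuration list, would be:

```python
from datasets import load_dataset

# Sketch: the "results" configuration holds the aggregated metrics of each run;
# the "latest" split points to the most recent run (assumption based on the
# split names declared in this card's configuration list).
results = load_dataset("open-llm-leaderboard/details_Sharathhebbar24__Med_GPT2",
                       "results",
                       split="latest")
print(results[0])  # one row of aggregated metrics for the latest run
```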
## Latest results
These are the [latest results from run 2024-02-02T16:52:19.552836](https://huggingface.co/datasets/open-llm-leaderboard/details_Sharathhebbar24__Med_GPT2/blob/main/results_2024-02-02T16-52-19.552836.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24043940952705942,
"acc_stderr": 0.030117491673403593,
"acc_norm": 0.24108796756918996,
"acc_norm_stderr": 0.030869772797897203,
"mc1": 0.23745410036719705,
"mc1_stderr": 0.014896277441041843,
"mc2": 0.3895253694669724,
"mc2_stderr": 0.014938286995541047
},
"harness|arc:challenge|25": {
"acc": 0.1885665529010239,
"acc_stderr": 0.011430897647675816,
"acc_norm": 0.23378839590443687,
"acc_norm_stderr": 0.012368225378507128
},
"harness|hellaswag|10": {
"acc": 0.2898824935271858,
"acc_stderr": 0.004527804016253779,
"acc_norm": 0.30989842660824535,
"acc_norm_stderr": 0.004615063817741858
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.03999262876617722,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.03999262876617722
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.18867924528301888,
"acc_stderr": 0.02407999513006224,
"acc_norm": 0.18867924528301888,
"acc_norm_stderr": 0.02407999513006224
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23121387283236994,
"acc_stderr": 0.0321473730202947,
"acc_norm": 0.23121387283236994,
"acc_norm_stderr": 0.0321473730202947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036843,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036843
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.022418042891113942,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.022418042891113942
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.12698412698412698,
"acc_stderr": 0.029780417522688438,
"acc_norm": 0.12698412698412698,
"acc_norm_stderr": 0.029780417522688438
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.14,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.14,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24516129032258063,
"acc_stderr": 0.024472243840895525,
"acc_norm": 0.24516129032258063,
"acc_norm_stderr": 0.024472243840895525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2315270935960591,
"acc_stderr": 0.029678333141444455,
"acc_norm": 0.2315270935960591,
"acc_norm_stderr": 0.029678333141444455
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2787878787878788,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.2787878787878788,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.23737373737373738,
"acc_stderr": 0.0303137105381989,
"acc_norm": 0.23737373737373738,
"acc_norm_stderr": 0.0303137105381989
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.2694300518134715,
"acc_stderr": 0.03201867122877793,
"acc_norm": 0.2694300518134715,
"acc_norm_stderr": 0.03201867122877793
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.21794871794871795,
"acc_stderr": 0.020932445774463196,
"acc_norm": 0.21794871794871795,
"acc_norm_stderr": 0.020932445774463196
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2074074074074074,
"acc_stderr": 0.024720713193952144,
"acc_norm": 0.2074074074074074,
"acc_norm_stderr": 0.024720713193952144
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.25210084033613445,
"acc_stderr": 0.02820554503327773,
"acc_norm": 0.25210084033613445,
"acc_norm_stderr": 0.02820554503327773
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.18543046357615894,
"acc_stderr": 0.031732843842942844,
"acc_norm": 0.18543046357615894,
"acc_norm_stderr": 0.031732843842942844
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3486238532110092,
"acc_stderr": 0.020431254090714328,
"acc_norm": 0.3486238532110092,
"acc_norm_stderr": 0.020431254090714328
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.030058202704309846,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.030058202704309846
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2696078431372549,
"acc_stderr": 0.03114557065948678,
"acc_norm": 0.2696078431372549,
"acc_norm_stderr": 0.03114557065948678
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3094170403587444,
"acc_stderr": 0.031024411740572206,
"acc_norm": 0.3094170403587444,
"acc_norm_stderr": 0.031024411740572206
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.35537190082644626,
"acc_stderr": 0.04369236326573981,
"acc_norm": 0.35537190082644626,
"acc_norm_stderr": 0.04369236326573981
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3128834355828221,
"acc_stderr": 0.03642914578292404,
"acc_norm": 0.3128834355828221,
"acc_norm_stderr": 0.03642914578292404
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25,
"acc_stderr": 0.04109974682633932,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04109974682633932
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.23504273504273504,
"acc_stderr": 0.027778835904935434,
"acc_norm": 0.23504273504273504,
"acc_norm_stderr": 0.027778835904935434
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.25798212005108556,
"acc_stderr": 0.01564583018834895,
"acc_norm": 0.25798212005108556,
"acc_norm_stderr": 0.01564583018834895
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.01435591196476786,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.01435591196476786
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.022122439772480768,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.022122439772480768
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.0227797190887334,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.0227797190887334
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25886524822695034,
"acc_stderr": 0.026129572527180848,
"acc_norm": 0.25886524822695034,
"acc_norm_stderr": 0.026129572527180848
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.30514705882352944,
"acc_stderr": 0.027971541370170595,
"acc_norm": 0.30514705882352944,
"acc_norm_stderr": 0.027971541370170595
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.16363636363636364,
"acc_stderr": 0.03543433054298678,
"acc_norm": 0.16363636363636364,
"acc_norm_stderr": 0.03543433054298678
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24081632653061225,
"acc_stderr": 0.027372942201788163,
"acc_norm": 0.24081632653061225,
"acc_norm_stderr": 0.027372942201788163
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.21890547263681592,
"acc_stderr": 0.029239174636647,
"acc_norm": 0.21890547263681592,
"acc_norm_stderr": 0.029239174636647
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-virology|5": {
"acc": 0.21084337349397592,
"acc_stderr": 0.03175554786629919,
"acc_norm": 0.21084337349397592,
"acc_norm_stderr": 0.03175554786629919
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.23391812865497075,
"acc_stderr": 0.032467217651178264,
"acc_norm": 0.23391812865497075,
"acc_norm_stderr": 0.032467217651178264
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23745410036719705,
"mc1_stderr": 0.014896277441041843,
"mc2": 0.3895253694669724,
"mc2_stderr": 0.014938286995541047
},
"harness|winogrande|5": {
"acc": 0.4972375690607735,
"acc_stderr": 0.014052271211616441
},
"harness|gsm8k|5": {
"acc": 0.01061410159211524,
"acc_stderr": 0.0028227133223877035
}
}
```
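
If you want to recompute aggregates from these per-task numbers, for example averaging the hendrycksTest (MMLU) subject accuracies, a minimal sketch is shown below; the file name `results.json` is an assumption for illustration (any local copy of the JSON above works):

```python
import json

# Sketch: average the per-subject MMLU (hendrycksTest) accuracies from a
# local copy of the results JSON shown above. "results.json" is a
# hypothetical file name used only for illustration.
with open("results.json") as f:
    results = json.load(f)

mmlu = [task["acc"] for name, task in results.items()
        if name.startswith("harness|hendrycksTest-")]
print(f"{len(mmlu)} MMLU subjects, mean acc = {sum(mmlu) / len(mmlu):.4f}")
```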
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Sharathhebbar24__Med_GPT2 | [
"region:us"
] | 2024-02-02T16:53:40+00:00 | {"pretty_name": "Evaluation run of Sharathhebbar24/Med_GPT2", "dataset_summary": "Dataset automatically created during the evaluation run of model [Sharathhebbar24/Med_GPT2](https://huggingface.co/Sharathhebbar24/Med_GPT2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sharathhebbar24__Med_GPT2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T16:52:19.552836](https://huggingface.co/datasets/open-llm-leaderboard/details_Sharathhebbar24__Med_GPT2/blob/main/results_2024-02-02T16-52-19.552836.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24043940952705942,\n \"acc_stderr\": 0.030117491673403593,\n \"acc_norm\": 0.24108796756918996,\n \"acc_norm_stderr\": 0.030869772797897203,\n \"mc1\": 0.23745410036719705,\n \"mc1_stderr\": 0.014896277441041843,\n \"mc2\": 0.3895253694669724,\n \"mc2_stderr\": 0.014938286995541047\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.1885665529010239,\n \"acc_stderr\": 0.011430897647675816,\n \"acc_norm\": 0.23378839590443687,\n \"acc_norm_stderr\": 0.012368225378507128\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2898824935271858,\n \"acc_stderr\": 0.004527804016253779,\n \"acc_norm\": 0.30989842660824535,\n \"acc_norm_stderr\": 0.004615063817741858\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3111111111111111,\n \"acc_stderr\": 0.03999262876617722,\n \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.03999262876617722\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.18867924528301888,\n \"acc_stderr\": 0.02407999513006224,\n \"acc_norm\": 0.18867924528301888,\n \"acc_norm_stderr\": 0.02407999513006224\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23121387283236994,\n \"acc_stderr\": 0.0321473730202947,\n \"acc_norm\": 0.23121387283236994,\n \"acc_norm_stderr\": 0.0321473730202947\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036843,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036843\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25396825396825395,\n \"acc_stderr\": 0.022418042891113942,\n \"acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.022418042891113942\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.12698412698412698,\n \"acc_stderr\": 0.029780417522688438,\n \"acc_norm\": 0.12698412698412698,\n \"acc_norm_stderr\": 0.029780417522688438\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.14,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.14,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24516129032258063,\n \"acc_stderr\": 0.024472243840895525,\n \"acc_norm\": 0.24516129032258063,\n \"acc_norm_stderr\": 0.024472243840895525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2315270935960591,\n \"acc_stderr\": 0.029678333141444455,\n \"acc_norm\": 0.2315270935960591,\n \"acc_norm_stderr\": 0.029678333141444455\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2787878787878788,\n \"acc_stderr\": 0.03501438706296781,\n \"acc_norm\": 0.2787878787878788,\n \"acc_norm_stderr\": 0.03501438706296781\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.23737373737373738,\n \"acc_stderr\": 0.0303137105381989,\n \"acc_norm\": 0.23737373737373738,\n \"acc_norm_stderr\": 0.0303137105381989\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.2694300518134715,\n \"acc_stderr\": 0.03201867122877793,\n \"acc_norm\": 0.2694300518134715,\n \"acc_norm_stderr\": 0.03201867122877793\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.21794871794871795,\n \"acc_stderr\": 0.020932445774463196,\n \"acc_norm\": 0.21794871794871795,\n \"acc_norm_stderr\": 0.020932445774463196\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2074074074074074,\n \"acc_stderr\": 0.024720713193952144,\n \"acc_norm\": 0.2074074074074074,\n \"acc_norm_stderr\": 0.024720713193952144\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.25210084033613445,\n \"acc_stderr\": 0.02820554503327773,\n \"acc_norm\": 0.25210084033613445,\n \"acc_norm_stderr\": 0.02820554503327773\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.18543046357615894,\n \"acc_stderr\": 0.031732843842942844,\n \"acc_norm\": 0.18543046357615894,\n \"acc_norm_stderr\": 0.031732843842942844\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3486238532110092,\n \"acc_stderr\": 0.020431254090714328,\n \"acc_norm\": 0.3486238532110092,\n \"acc_norm_stderr\": 0.020431254090714328\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2638888888888889,\n \"acc_stderr\": 0.030058202704309846,\n \"acc_norm\": 0.2638888888888889,\n \"acc_norm_stderr\": 0.030058202704309846\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2696078431372549,\n \"acc_stderr\": 0.03114557065948678,\n \"acc_norm\": 0.2696078431372549,\n \"acc_norm_stderr\": 0.03114557065948678\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3094170403587444,\n \"acc_stderr\": 0.031024411740572206,\n \"acc_norm\": 0.3094170403587444,\n \"acc_norm_stderr\": 0.031024411740572206\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.35537190082644626,\n \"acc_stderr\": 0.04369236326573981,\n \"acc_norm\": 0.35537190082644626,\n \"acc_norm_stderr\": 0.04369236326573981\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.3128834355828221,\n \"acc_stderr\": 0.03642914578292404,\n \"acc_norm\": 0.3128834355828221,\n \"acc_norm_stderr\": 0.03642914578292404\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04109974682633932,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04109974682633932\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.23504273504273504,\n \"acc_stderr\": 0.027778835904935434,\n \"acc_norm\": 0.23504273504273504,\n \"acc_norm_stderr\": 0.027778835904935434\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.25798212005108556,\n \"acc_stderr\": 
0.01564583018834895,\n \"acc_norm\": 0.25798212005108556,\n \"acc_norm_stderr\": 0.01564583018834895\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n \"acc_stderr\": 0.01435591196476786,\n \"acc_norm\": 0.2435754189944134,\n \"acc_norm_stderr\": 0.01435591196476786\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.024848018263875195,\n \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.024848018263875195\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n \"acc_stderr\": 0.022122439772480768,\n \"acc_norm\": 0.1864951768488746,\n \"acc_norm_stderr\": 0.022122439772480768\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21296296296296297,\n \"acc_stderr\": 0.0227797190887334,\n \"acc_norm\": 0.21296296296296297,\n \"acc_norm_stderr\": 0.0227797190887334\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.25886524822695034,\n \"acc_stderr\": 0.026129572527180848,\n \"acc_norm\": 0.25886524822695034,\n \"acc_norm_stderr\": 0.026129572527180848\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.30514705882352944,\n \"acc_stderr\": 0.027971541370170595,\n \"acc_norm\": 0.30514705882352944,\n \"acc_norm_stderr\": 0.027971541370170595\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.16363636363636364,\n \"acc_stderr\": 0.03543433054298678,\n \"acc_norm\": 0.16363636363636364,\n \"acc_norm_stderr\": 0.03543433054298678\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.24081632653061225,\n \"acc_stderr\": 0.027372942201788163,\n \"acc_norm\": 0.24081632653061225,\n \"acc_norm_stderr\": 0.027372942201788163\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.21890547263681592,\n \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.21890547263681592,\n \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.21084337349397592,\n \"acc_stderr\": 0.03175554786629919,\n \"acc_norm\": 0.21084337349397592,\n \"acc_norm_stderr\": 0.03175554786629919\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.23391812865497075,\n \"acc_stderr\": 0.032467217651178264,\n \"acc_norm\": 0.23391812865497075,\n \"acc_norm_stderr\": 0.032467217651178264\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23745410036719705,\n \"mc1_stderr\": 0.014896277441041843,\n \"mc2\": 0.3895253694669724,\n \"mc2_stderr\": 0.014938286995541047\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.4972375690607735,\n \"acc_stderr\": 0.014052271211616441\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01061410159211524,\n \"acc_stderr\": 0.0028227133223877035\n }\n}\n```", "repo_url": 
"https://huggingface.co/Sharathhebbar24/Med_GPT2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|arc:challenge|25_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|gsm8k|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hellaswag|10_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T16-52-19.552836.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T16-52-19.552836.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T16-52-19.552836.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T16-52-19.552836.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T16-52-19.552836.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T16_52_19.552836", "path": ["**/details_harness|winogrande|5_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T16-52-19.552836.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T16_52_19.552836", "path": ["results_2024-02-02T16-52-19.552836.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T16-52-19.552836.parquet"]}]}]} | 2024-02-02T16:54:04+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Sharathhebbar24/Med_GPT2
Dataset automatically created during the evaluation run of model Sharathhebbar24/Med_GPT2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-02-02T16:52:19.552836 (note that there might be results for other tasks in this repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Sharathhebbar24/Med_GPT2\n\n\n\nDataset automatically created during the evaluation run of model Sharathhebbar24/Med_GPT2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T16:52:19.552836(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Sharathhebbar24/Med_GPT2\n\n\n\nDataset automatically created during the evaluation run of model Sharathhebbar24/Med_GPT2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T16:52:19.552836(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
f4a752a439f9d5c6fa8c3f84118a48a68abaf057 |
## Description
This is a fair-use dataset of research-only experiments; it is not sold.
The visuals and sounds have been re-generated from scratch using AI.
## Model
SVD
# Tags
- Movie | jbilcke/ai-tube-cinema | [
"license:cc-by-nc-4.0",
"region:us"
] | 2024-02-02T16:55:49+00:00 | {"license": "cc-by-nc-4.0", "pretty_name": "AI Cinema"} | 2024-02-07T22:18:47+00:00 | [] | [] | TAGS
#license-cc-by-nc-4.0 #region-us
|
## Description
This is a fair-use dataset of research-only experiments; it is not sold.
The visuals and sounds have been re-generated from scratch using AI.
## Model
SVD
# Tags
- Movie | [
"## Description\n\nThis is a fair-use dataset of research-only experiments, which are not sold.\nThe visuals and sounds have been re-generated from scratch using AI.",
"## Model\n\nSVD",
"# Tags\n\n- Movie"
] | [
"TAGS\n#license-cc-by-nc-4.0 #region-us \n",
"## Description\n\nThis is a fair-use dataset of research-only experiments, which are not sold.\nThe visuals and sounds have been re-generated from scratch using AI.",
"## Model\n\nSVD",
"# Tags\n\n- Movie"
] |
8ebfe6a5ba5f898b3185c1852bd2ae45c4f8af94 |
# Dataset Card for Evaluation run of YKM11/Mistral-7B-adaptv0.9
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [YKM11/Mistral-7B-adaptv0.9](https://huggingface.co/YKM11/Mistral-7B-adaptv0.9) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_YKM11__Mistral-7B-adaptv0.9",
"harness_winogrande_5",
split="train")
```
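The metadata for this collection also defines an aggregated "results" configuration whose "latest" split points at the most recent run. Assuming this card follows the same pattern as its siblings in the collection, the run-level aggregates can be loaded the same way (a sketch, not an official recipe):

```python
from datasets import load_dataset

# Sketch under the assumption that this repo, like the other detail repos
# in this collection, exposes a "results" config with a "latest" split
# holding the aggregated metrics.
results = load_dataset(
    "open-llm-leaderboard/details_YKM11__Mistral-7B-adaptv0.9",
    "results",
    split="latest",
)
print(results[0])  # one row of aggregated metrics for the latest run
```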
## Latest results
These are the [latest results from run 2024-02-02T17:00:46.395617](https://huggingface.co/datasets/open-llm-leaderboard/details_YKM11__Mistral-7B-adaptv0.9/blob/main/results_2024-02-02T17-00-46.395617.json) (note that there might be results for other tasks in this repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6534693218977564,
"acc_stderr": 0.03204118105946018,
"acc_norm": 0.652862155312837,
"acc_norm_stderr": 0.03271329249335617,
"mc1": 0.5887392900856793,
"mc1_stderr": 0.017225627083660874,
"mc2": 0.7311817247902683,
"mc2_stderr": 0.014597852035553836
},
"harness|arc:challenge|25": {
"acc": 0.7098976109215017,
"acc_stderr": 0.013261573677520767,
"acc_norm": 0.735494880546075,
"acc_norm_stderr": 0.012889272949313368
},
"harness|hellaswag|10": {
"acc": 0.722266480780721,
"acc_stderr": 0.004469659042824775,
"acc_norm": 0.8895638319059949,
"acc_norm_stderr": 0.0031279207383941086
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337142,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337142
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778398,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778398
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268545,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268545
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.03287666758603491,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.03287666758603491
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.027772533334218967,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.027772533334218967
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473086,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473086
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461766,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461766
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455334,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455334
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579825,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579825
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468358,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468358
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4346368715083799,
"acc_stderr": 0.016578997435496713,
"acc_norm": 0.4346368715083799,
"acc_norm_stderr": 0.016578997435496713
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188933,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188933
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.470013037809648,
"acc_stderr": 0.01274724896707907,
"acc_norm": 0.470013037809648,
"acc_norm_stderr": 0.01274724896707907
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396553,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396553
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233268,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233268
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5887392900856793,
"mc1_stderr": 0.017225627083660874,
"mc2": 0.7311817247902683,
"mc2_stderr": 0.014597852035553836
},
"harness|winogrande|5": {
"acc": 0.856353591160221,
"acc_stderr": 0.009857280052696737
},
"harness|gsm8k|5": {
"acc": 0.6793025018953753,
"acc_stderr": 0.012856468433722283
}
}
```
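If you prefer the raw JSON over the `datasets` API, the results file linked above can also be fetched directly from the repo with `huggingface_hub`. A minimal sketch; note that the excerpt above shows only the metrics, and in the raw file they may sit under a top-level `"results"` key, which the snippet allows for:

```python
import json

from huggingface_hub import hf_hub_download

# Download the exact results file referenced in the link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_YKM11__Mistral-7B-adaptv0.9",
    filename="results_2024-02-02T17-00-46.395617.json",
    repo_type="dataset",
)
with open(path) as f:
    raw = json.load(f)

# The metrics may be nested under "results" in the raw file; fall back to
# the top level if the layout matches the excerpt printed above.
metrics = raw.get("results", raw)
print(metrics["all"]["acc"])        # overall accuracy, ~0.6535
print(metrics["harness|gsm8k|5"])   # per-task metrics
```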
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_YKM11__Mistral-7B-adaptv0.9 | [
"region:us"
] | 2024-02-02T17:03:07+00:00 | {"pretty_name": "Evaluation run of YKM11/Mistral-7B-adaptv0.9", "dataset_summary": "Dataset automatically created during the evaluation run of model [YKM11/Mistral-7B-adaptv0.9](https://huggingface.co/YKM11/Mistral-7B-adaptv0.9) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_YKM11__Mistral-7B-adaptv0.9\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T17:00:46.395617](https://huggingface.co/datasets/open-llm-leaderboard/details_YKM11__Mistral-7B-adaptv0.9/blob/main/results_2024-02-02T17-00-46.395617.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6534693218977564,\n \"acc_stderr\": 0.03204118105946018,\n \"acc_norm\": 0.652862155312837,\n \"acc_norm_stderr\": 0.03271329249335617,\n \"mc1\": 0.5887392900856793,\n \"mc1_stderr\": 0.017225627083660874,\n \"mc2\": 0.7311817247902683,\n \"mc2_stderr\": 0.014597852035553836\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7098976109215017,\n \"acc_stderr\": 0.013261573677520767,\n \"acc_norm\": 0.735494880546075,\n \"acc_norm_stderr\": 0.012889272949313368\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.722266480780721,\n \"acc_stderr\": 0.004469659042824775,\n \"acc_norm\": 0.8895638319059949,\n \"acc_norm_stderr\": 0.0031279207383941086\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337142,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337142\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108102,\n \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108102\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778398,\n \"acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778398\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268545,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268545\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.03287666758603491,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.03287666758603491\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218967,\n \"acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218967\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473086,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473086\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461766,\n \"acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461766\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159463,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159463\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8199233716475096,\n \"acc_stderr\": 0.013740797258579825,\n \"acc_norm\": 0.8199233716475096,\n \"acc_norm_stderr\": 0.013740797258579825\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468358,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468358\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4346368715083799,\n \"acc_stderr\": 0.016578997435496713,\n \"acc_norm\": 0.4346368715083799,\n \"acc_norm_stderr\": 0.016578997435496713\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.470013037809648,\n \"acc_stderr\": 0.01274724896707907,\n \"acc_norm\": 0.470013037809648,\n \"acc_norm_stderr\": 0.01274724896707907\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396553,\n \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396553\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.026508590656233268,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.026508590656233268\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5887392900856793,\n \"mc1_stderr\": 0.017225627083660874,\n \"mc2\": 0.7311817247902683,\n \"mc2_stderr\": 0.014597852035553836\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.856353591160221,\n \"acc_stderr\": 0.009857280052696737\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6793025018953753,\n \"acc_stderr\": 0.012856468433722283\n }\n}\n```", "repo_url": 
"https://huggingface.co/YKM11/Mistral-7B-adaptv0.9", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|arc:challenge|25_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|gsm8k|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hellaswag|10_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T17-00-46.395617.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T17-00-46.395617.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T17-00-46.395617.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T17-00-46.395617.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T17-00-46.395617.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T17_00_46.395617", "path": ["**/details_harness|winogrande|5_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T17-00-46.395617.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T17_00_46.395617", "path": ["results_2024-02-02T17-00-46.395617.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T17-00-46.395617.parquet"]}]}]} | 2024-02-02T17:03:32+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of YKM11/Mistral-7B-adaptv0.9
Dataset automatically created during the evaluation run of model YKM11/Mistral-7B-adaptv0.9 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
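A minimal sketch of that call (the dataset path follows the `details_<org>__<model>` naming convention used elsewhere in these leaderboard cards):

```python
from datasets import load_dataset

# Fetch the Winogrande 5-shot details split for this evaluation run.
data = load_dataset("open-llm-leaderboard/details_YKM11__Mistral-7B-adaptv0.9",
	"harness_winogrande_5",
	split="train")
```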
## Latest results
These are the latest results from run 2024-02-02T17:00:46.395617 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of YKM11/Mistral-7B-adaptv0.9\n\n\n\nDataset automatically created during the evaluation run of model YKM11/Mistral-7B-adaptv0.9 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T17:00:46.395617(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of YKM11/Mistral-7B-adaptv0.9\n\n\n\nDataset automatically created during the evaluation run of model YKM11/Mistral-7B-adaptv0.9 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T17:00:46.395617(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
e3814d1a1825224d25ec5dd08d1a88f5fa493404 | # Dataset Card for [MultiCCAligned-TW-Corpus]
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:** [Heng-Shiou Sheu](mailto:[email protected])
### Dataset Summary
MultiCCAligned-TW-Corpus is a multilingual dataset for machine translation benchmarking, derived from user-contributed translations collected by [OPUS](https://opus.nlpl.eu/MultiCCAligned/zh-TW&th/v1.1/MultiCCAligned) and distributed by [OPUS](https://opus.nlpl.eu/). The dataset includes test and development data sorted by language pair. It includes test sets for hundreds of language pairs and is continuously updated. Please check the version number tag to cite the version you are using.
### Supported Tasks and Leaderboards
### Languages
This dataset covers hundreds of languages and language pairs and is organized by ISO-639-1 language code. The current release covers the following languages: Traditional Chinese, English, Japanese, Korean, Indonesian, Vietnamese, and Thai.
## Dataset Structure
### Data Instances
The data are stored in comma-separated files with three fields: instruction, input, and output. Note that we do not imply a translation direction; the dataset is considered symmetric and serves as a test set in both directions.
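For example, a single language pair can be loaded by its config name (the names such as `en-zh_TW`, the `train` split, and the three fields come from this card's metadata; treat the snippet as a sketch):

```python
from datasets import load_dataset

# Load the en-zh_TW pair; each row carries the three fields
# described above: "instruction", "input" and "output".
ds = load_dataset("Heng666/MultiCCAligned-TW-Corpus", "en-zh_TW", split="train")
print(ds[0]["instruction"])
print(ds[0]["input"])
print(ds[0]["output"])
```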
### Data Splits
Only the Train split has been prepared so far.
## Dataset Creation
### Curation Rationale
This dataset will be continuously updated and will be publicly released on GitHub in the future. High language coverage is the main goal of this project, and the dataset is prepared consistently and systematically with standardized language labels and distribution formats.
### Source Data
#### Initial Data Collection and Normalization
The MultiCCAligned dataset was created from 68 Commoncrawl snapshots (as of March 2020). Documents were split into sentences based on punctuation, and deduplication was performed. No intellectual property claims are made in the preparation of the corpus. The original distribution is available from http://www.statmt.org/cc-aligned/
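As an illustration only (a toy sketch of the two steps just described, not the actual CC-Aligned pipeline):

```python
import re

def split_sentences(doc: str) -> list[str]:
    # Naive split on sentence-final punctuation (Latin and CJK).
    parts = re.split(r"(?<=[.!?。!?])\s*", doc)
    return [p.strip() for p in parts if p.strip()]

def dedupe(sentences: list[str]) -> list[str]:
    # Exact-match deduplication, preserving first occurrence.
    seen = set()
    return [s for s in sentences if not (s in seen or seen.add(s))]
```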
#### Who are the source language producers?
These transcripts were produced by the [EMNLP'20 authors](https://www.aclweb.org/anthology/2020.emnlp-main.480.pdf).
### Personal and Sensitive Information
For information on the handling of personal and sensitive information, please consult the [original providers](https://www.aclweb.org/anthology/2020.emnlp-main.480.pdf) of the data. This dataset has not been processed in any way to detect or remove potentially sensitive or personal information.
### Social Impact of Dataset
Language coverage is high, so the dataset represents a very valuable resource for machine translation development, especially for lower-resource languages and language pairs. The continuously growing database also represents a dynamic resource whose value will grow further.
### Other Known Limitations
The sentences are usually short and therefore easy to translate. For high-resource languages, this makes the results less useful than more challenging benchmarks. For lower-resource language pairs, the limited complexity of the examples is actually a good way to measure progress, even in very challenging settings.
### Dataset Curators
This dataset was created by Heng-Shiou Sheu.
### Licensing Information
No license is used for these datasets.
### Citation Information
```
@misc{Heng666/MultiCCAligned-TW-Corpus,
title={Taiwanese Phrases Multilingual Translation Dataset from MultiCCAligned Talks},
author={Heng-Shiou Sheu},
year={2024},
url={https://huggingface.co/datasets/Heng666/MultiCCAligned-TW-Corpus},
}
``` | Heng666/MultiCCAligned-TW-Corpus | [
"task_categories:translation",
"size_categories:1M<n<10M",
"language:tw",
"language:en",
"language:ja",
"language:ko",
"language:id",
"language:vi",
"language:th",
"license:mit",
"MultiCCAligned",
"translation",
"OPUS",
"region:us"
] | 2024-02-02T17:14:20+00:00 | {"language": ["tw", "en", "ja", "ko", "id", "vi", "th"], "license": "mit", "size_categories": ["1M<n<10M"], "task_categories": ["translation"], "pretty_name": "MultiCCAligned-TW-Corpus", "dataset_info": [{"config_name": "en-zh_TW", "features": [{"name": "instruction", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 915753306, "num_examples": 4431807}], "download_size": 427856849, "dataset_size": 915753306}, {"config_name": "id-zh_TW", "features": [{"name": "instruction", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 40825076, "num_examples": 163854}], "download_size": 15145811, "dataset_size": 40825076}, {"config_name": "ja-zh_TW", "features": [{"name": "instruction", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 183580572, "num_examples": 741236}], "download_size": 73106413, "dataset_size": 183580572}, {"config_name": "ko-zh_TW", "features": [{"name": "instruction", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 66408562, "num_examples": 333554}], "download_size": 27023874, "dataset_size": 66408562}, {"config_name": "th-zh_TW", "features": [{"name": "instruction", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 84855507, "num_examples": 302186}], "download_size": 30150398, "dataset_size": 84855507}, {"config_name": "vi-zh_TW", "features": [{"name": "instruction", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 54393256, "num_examples": 237025}], "download_size": 19441175, "dataset_size": 54393256}], "configs": [{"config_name": "en-zh_TW", "data_files": [{"split": "train", "path": "en-zh_TW/train-*"}]}, {"config_name": "id-zh_TW", "data_files": [{"split": "train", "path": "id-zh_TW/train-*"}]}, {"config_name": "ja-zh_TW", "data_files": [{"split": "train", "path": "ja-zh_TW/train-*"}]}, {"config_name": "ko-zh_TW", "data_files": [{"split": "train", "path": "ko-zh_TW/train-*"}]}, {"config_name": "th-zh_TW", "data_files": [{"split": "train", "path": "th-zh_TW/train-*"}]}, {"config_name": "vi-zh_TW", "data_files": [{"split": "train", "path": "vi-zh_TW/train-*"}]}], "tags": ["MultiCCAligned", "translation", "OPUS"]} | 2024-02-02T17:30:35+00:00 | [] | [
"tw",
"en",
"ja",
"ko",
"id",
"vi",
"th"
] | TAGS
#task_categories-translation #size_categories-1M<n<10M #language-Twi #language-English #language-Japanese #language-Korean #language-Indonesian #language-Vietnamese #language-Thai #license-mit #MultiCCAligned #translation #OPUS #region-us
| # Dataset Card for [MultiCCAligned-TW-Corpus]
## Table of Contents
- Table of Contents
- Dataset Description
- Dataset Summary
- Supported Tasks and Leaderboards
- Languages
- Dataset Structure
- Data Instances
- Data Fields
- Data Splits
- Dataset Creation
- Curation Rationale
- Source Data
- Annotations
- Personal and Sensitive Information
- Considerations for Using the Data
- Social Impact of Dataset
- Discussion of Biases
- Other Known Limitations
- Additional Information
- Dataset Curators
- Licensing Information
- Citation Information
- Contributions
## Dataset Description
- Homepage:
- Repository:
- Paper:
- Leaderboard:
- Point of Contact: Heng-Shiou Sheu
### Dataset Summary
MultiCCAligned-TW-Corpus is a multilingual dataset for machine translation benchmarking, derived from user-contributed translations collected by OPUS and distributed by OPUS. The dataset includes test and development data sorted by language pair. It includes test sets for hundreds of language pairs and is continuously updated. Please check the version number tag to cite the version you are using.
### Supported Tasks and Leaderboards
### Languages
This dataset covers hundreds of languages and language pairs and is organized by ISO-639-1 language code. The current release covers the following languages: Traditional Chinese, English, Japanese, Korean, Indonesian, Vietnamese, and Thai.
## Dataset Structure
### Data Instances
The data are stored in comma-separated files with three fields: instruction, input, and output. Note that we do not imply a translation direction; the dataset is considered symmetric and serves as a test set in both directions.
### Data Splits
Only the Train split has been prepared so far.
## Dataset Creation
### Curation Rationale
This dataset will be continuously updated and will be publicly released on GitHub in the future. High language coverage is the main goal of this project, and the dataset is prepared consistently and systematically with standardized language labels and distribution formats.
### Source Data
#### Initial Data Collection and Normalization
The MultiCCAligned dataset was created from 68 Commoncrawl snapshots (as of March 2020). Documents were split into sentences based on punctuation, and deduplication was performed. No intellectual property claims are made in the preparation of the corpus. The original distribution is available from URL
#### Who are the source language producers?
These transcripts were produced by the EMNLP'20 authors.
### Personal and Sensitive Information
For information on the handling of personal and sensitive information, please consult the original providers of the data. This dataset has not been processed in any way to detect or remove potentially sensitive or personal information.
### Social Impact of Dataset
Language coverage is high, so the dataset represents a very valuable resource for machine translation development, especially for lower-resource languages and language pairs. The continuously growing database also represents a dynamic resource whose value will grow further.
### Other Known Limitations
The sentences are usually short and therefore easy to translate. For high-resource languages, this makes the results less useful than more challenging benchmarks. For lower-resource language pairs, the limited complexity of the examples is actually a good way to measure progress, even in very challenging settings.
### Dataset Curators
This dataset was created by Heng-Shiou Sheu.
### Licensing Information
No license is used for these datasets.
| [
"# Dataset Card for [MultiCCAligned-TW-Corpus]",
"## Table of Contents\n- Table of Contents\n- Dataset Description\n - Dataset Summary\n - Supported Tasks and Leaderboards\n - Languages\n- Dataset Structure\n - Data Instances\n - Data Fields\n - Data Splits\n- Dataset Creation\n - Curation Rationale\n - Source Data\n - Annotations\n - Personal and Sensitive Information\n- Considerations for Using the Data\n - Social Impact of Dataset\n - Discussion of Biases\n - Other Known Limitations\n- Additional Information\n - Dataset Curators\n - Licensing Information\n - Citation Information\n - Contributions",
"## Dataset Description\n- Homepage: \n- Repository: \n- Paper: \n- Leaderboard:\n- Point of Contact: Heng-Shiou Sheu",
"### Dataset Summary\nMultiCCAligned-TW-Corpus 是一個機器翻譯基準的多語言資料集,源自 OPUS 收集的使用者貢獻的翻譯,並由 OPUS。該資料集包括按語言對排序的測試和開發資料。它包括數百種語言對的測試集,並且不斷更新。請檢查版本號標籤以引用您正在使用的版本。",
"### Supported Tasks and Leaderboards",
"### Languages\n此資料集涵蓋數百種語言和語言對,並按 ISO-639-1 語言組織。目前版本涵蓋以下語言。繁體中文、英文、日文、韓文、印尼文、越南文、泰文",
"## Dataset Structure",
"### Data Instances\n\n資料以 , 分隔檔案中內容,具有三個欄位:指示、輸入和輸出。請注意,我們並不暗示平移方向,並認為資料集是對稱的並用作兩個方向的測試集。",
"### Data Splits\n先整理出 Train 資料。",
"## Dataset Creation",
"### Curation Rationale\n本資料集將持續更新,未來將公開發佈於 Github 當中。高語言覆蓋率是本計畫的主要目標,資料集的準備與標準化語言標籤和分發格式保持一致和系統化。",
"### Source Data",
"#### Initial Data Collection and Normalization\nMultiCCAligned 資料集是從根據 68 個 Commoncrawl 快照創建的(截至 2020 年 3 月)。根據標點符號將文件分割成句子,並執行重複資料刪除。語料庫的準備工作並沒有提出任何智慧財產權主張。原始發行版可從 URL 取得",
"#### Who are the source language producers?\n這些轉錄本已由 EMNLP'20 作者群製作。",
"### Personal and Sensitive Information\n有關處理個人資訊和敏感資訊的信息,我們請諮詢資料的原始提供者。該資料集未經過任何方式處理以檢測或刪除潛在的敏感資訊或個人資訊。",
"### Social Impact of Dataset\n語言覆蓋率很高,因此它代表了機器翻譯開發的非常有價值的資源,特別是對於資源較少的語言和語言對。不斷成長的資料庫也代表著一種動態資源,其價值將進一步成長。",
"### Other Known Limitations\n這些句子通常很短,因此很容易翻譯。對於高資源語言,這會導致結果不如更具挑戰性的基準有用。對於資源較少的語言對來說,即使在非常具有挑戰性的設定中,範例的有限複雜性實際上也是衡量進度的一件好事。",
"### Dataset Curators\n此資料集由Heng-Shiou Sheu 製作。",
"### Licensing Information\n這些資料集沒使用 License."
] | [
"TAGS\n#task_categories-translation #size_categories-1M<n<10M #language-Twi #language-English #language-Japanese #language-Korean #language-Indonesian #language-Vietnamese #language-Thai #license-mit #MultiCCAligned #translation #OPUS #region-us \n",
"# Dataset Card for [MultiCCAligned-TW-Corpus]",
"## Table of Contents\n- Table of Contents\n- Dataset Description\n - Dataset Summary\n - Supported Tasks and Leaderboards\n - Languages\n- Dataset Structure\n - Data Instances\n - Data Fields\n - Data Splits\n- Dataset Creation\n - Curation Rationale\n - Source Data\n - Annotations\n - Personal and Sensitive Information\n- Considerations for Using the Data\n - Social Impact of Dataset\n - Discussion of Biases\n - Other Known Limitations\n- Additional Information\n - Dataset Curators\n - Licensing Information\n - Citation Information\n - Contributions",
"## Dataset Description\n- Homepage: \n- Repository: \n- Paper: \n- Leaderboard:\n- Point of Contact: Heng-Shiou Sheu",
"### Dataset Summary\nMultiCCAligned-TW-Corpus 是一個機器翻譯基準的多語言資料集,源自 OPUS 收集的使用者貢獻的翻譯,並由 OPUS。該資料集包括按語言對排序的測試和開發資料。它包括數百種語言對的測試集,並且不斷更新。請檢查版本號標籤以引用您正在使用的版本。",
"### Supported Tasks and Leaderboards",
"### Languages\n此資料集涵蓋數百種語言和語言對,並按 ISO-639-1 語言組織。目前版本涵蓋以下語言。繁體中文、英文、日文、韓文、印尼文、越南文、泰文",
"## Dataset Structure",
"### Data Instances\n\n資料以 , 分隔檔案中內容,具有三個欄位:指示、輸入和輸出。請注意,我們並不暗示平移方向,並認為資料集是對稱的並用作兩個方向的測試集。",
"### Data Splits\n先整理出 Train 資料。",
"## Dataset Creation",
"### Curation Rationale\n本資料集將持續更新,未來將公開發佈於 Github 當中。高語言覆蓋率是本計畫的主要目標,資料集的準備與標準化語言標籤和分發格式保持一致和系統化。",
"### Source Data",
"#### Initial Data Collection and Normalization\nMultiCCAligned 資料集是從根據 68 個 Commoncrawl 快照創建的(截至 2020 年 3 月)。根據標點符號將文件分割成句子,並執行重複資料刪除。語料庫的準備工作並沒有提出任何智慧財產權主張。原始發行版可從 URL 取得",
"#### Who are the source language producers?\n這些轉錄本已由 EMNLP'20 作者群製作。",
"### Personal and Sensitive Information\n有關處理個人資訊和敏感資訊的信息,我們請諮詢資料的原始提供者。該資料集未經過任何方式處理以檢測或刪除潛在的敏感資訊或個人資訊。",
"### Social Impact of Dataset\n語言覆蓋率很高,因此它代表了機器翻譯開發的非常有價值的資源,特別是對於資源較少的語言和語言對。不斷成長的資料庫也代表著一種動態資源,其價值將進一步成長。",
"### Other Known Limitations\n這些句子通常很短,因此很容易翻譯。對於高資源語言,這會導致結果不如更具挑戰性的基準有用。對於資源較少的語言對來說,即使在非常具有挑戰性的設定中,範例的有限複雜性實際上也是衡量進度的一件好事。",
"### Dataset Curators\n此資料集由Heng-Shiou Sheu 製作。",
"### Licensing Information\n這些資料集沒使用 License."
] |
a50ab827a5fe6e216e4b0b04eb580ba622cf624f |
# Dataset Card for Evaluation run of YKM11/Mistral-7B-adaptv1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [YKM11/Mistral-7B-adaptv1](https://huggingface.co/YKM11/Mistral-7B-adaptv1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_YKM11__Mistral-7B-adaptv1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-02T17:19:39.634045](https://huggingface.co/datasets/open-llm-leaderboard/details_YKM11__Mistral-7B-adaptv1/blob/main/results_2024-02-02T17-19-39.634045.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6505580965951835,
"acc_stderr": 0.0321558879228405,
"acc_norm": 0.6500251031041236,
"acc_norm_stderr": 0.032828345007388536,
"mc1": 0.609547123623011,
"mc1_stderr": 0.01707823074343146,
"mc2": 0.7455462880169612,
"mc2_stderr": 0.014429048125191489
},
"harness|arc:challenge|25": {
"acc": 0.712457337883959,
"acc_stderr": 0.013226719056266129,
"acc_norm": 0.7397610921501706,
"acc_norm_stderr": 0.01282193022511257
},
"harness|hellaswag|10": {
"acc": 0.732423819956184,
"acc_stderr": 0.00441790600043053,
"acc_norm": 0.8937462656841266,
"acc_norm_stderr": 0.0030753230104084216
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.028049186315695255,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.028049186315695255
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7916666666666666,
"acc_stderr": 0.033961162058453336,
"acc_norm": 0.7916666666666666,
"acc_norm_stderr": 0.033961162058453336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026705,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026705
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.02931820364520686,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.02931820364520686
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.03038835355188679,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.03038835355188679
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.034086558679777494,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.034086558679777494
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621115,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621115
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.03322015795776741,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.03322015795776741
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281365,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281365
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834845,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834845
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42569832402234636,
"acc_stderr": 0.01653682964899711,
"acc_norm": 0.42569832402234636,
"acc_norm_stderr": 0.01653682964899711
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.0256468630971379,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.0256468630971379
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984813,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984813
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4661016949152542,
"acc_stderr": 0.01274085387294983,
"acc_norm": 0.4661016949152542,
"acc_norm_stderr": 0.01274085387294983
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.019070985589687495,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.019070985589687495
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.609547123623011,
"mc1_stderr": 0.01707823074343146,
"mc2": 0.7455462880169612,
"mc2_stderr": 0.014429048125191489
},
"harness|winogrande|5": {
"acc": 0.8547750591949487,
"acc_stderr": 0.009902153904760824
},
"harness|gsm8k|5": {
"acc": 0.6664139499620925,
"acc_stderr": 0.012987282131410809
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_YKM11__Mistral-7B-adaptv1 | [
"region:us"
] | 2024-02-02T17:21:58+00:00 | {"pretty_name": "Evaluation run of YKM11/Mistral-7B-adaptv1", "dataset_summary": "Dataset automatically created during the evaluation run of model [YKM11/Mistral-7B-adaptv1](https://huggingface.co/YKM11/Mistral-7B-adaptv1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_YKM11__Mistral-7B-adaptv1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T17:19:39.634045](https://huggingface.co/datasets/open-llm-leaderboard/details_YKM11__Mistral-7B-adaptv1/blob/main/results_2024-02-02T17-19-39.634045.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6505580965951835,\n \"acc_stderr\": 0.0321558879228405,\n \"acc_norm\": 0.6500251031041236,\n \"acc_norm_stderr\": 0.032828345007388536,\n \"mc1\": 0.609547123623011,\n \"mc1_stderr\": 0.01707823074343146,\n \"mc2\": 0.7455462880169612,\n \"mc2_stderr\": 0.014429048125191489\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.712457337883959,\n \"acc_stderr\": 0.013226719056266129,\n \"acc_norm\": 0.7397610921501706,\n \"acc_norm_stderr\": 0.01282193022511257\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.732423819956184,\n \"acc_stderr\": 0.00441790600043053,\n \"acc_norm\": 0.8937462656841266,\n \"acc_norm_stderr\": 0.0030753230104084216\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.028049186315695255,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.028049186315695255\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7916666666666666,\n \"acc_stderr\": 0.033961162058453336,\n \"acc_norm\": 0.7916666666666666,\n \"acc_norm_stderr\": 0.033961162058453336\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n 
\"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026705,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026705\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.362962962962963,\n \"acc_stderr\": 0.02931820364520686,\n \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.02931820364520686\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.03038835355188679,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.03038835355188679\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.034086558679777494,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.034086558679777494\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621115,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621115\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.03322015795776741,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.03322015795776741\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281365,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281365\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n \"acc_stderr\": 0.013664230995834845,\n 
\"acc_norm\": 0.822477650063857,\n \"acc_norm_stderr\": 0.013664230995834845\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42569832402234636,\n \"acc_stderr\": 0.01653682964899711,\n \"acc_norm\": 0.42569832402234636,\n \"acc_norm_stderr\": 0.01653682964899711\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.0256468630971379,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.0256468630971379\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.025583062489984813,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.025583062489984813\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4661016949152542,\n \"acc_stderr\": 0.01274085387294983,\n \"acc_norm\": 0.4661016949152542,\n \"acc_norm_stderr\": 0.01274085387294983\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687495,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687495\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.609547123623011,\n \"mc1_stderr\": 0.01707823074343146,\n \"mc2\": 0.7455462880169612,\n \"mc2_stderr\": 0.014429048125191489\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8547750591949487,\n \"acc_stderr\": 0.009902153904760824\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6664139499620925,\n \"acc_stderr\": 0.012987282131410809\n }\n}\n```", "repo_url": 
"https://huggingface.co/YKM11/Mistral-7B-adaptv1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|arc:challenge|25_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|gsm8k|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hellaswag|10_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T17-19-39.634045.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T17-19-39.634045.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T17-19-39.634045.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T17-19-39.634045.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T17-19-39.634045.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T17_19_39.634045", "path": ["**/details_harness|winogrande|5_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T17-19-39.634045.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T17_19_39.634045", "path": ["results_2024-02-02T17-19-39.634045.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T17-19-39.634045.parquet"]}]}]} | 2024-02-02T17:22:24+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of YKM11/Mistral-7B-adaptv1
Dataset automatically created during the evaluation run of model YKM11/Mistral-7B-adaptv1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
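```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_YKM11__Mistral-7B-adaptv1",
	"harness_winogrande_5",
	split="train")
```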
## Latest results
These are the latest results from run 2024-02-02T17:19:39.634045 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of YKM11/Mistral-7B-adaptv1\n\n\n\nDataset automatically created during the evaluation run of model YKM11/Mistral-7B-adaptv1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T17:19:39.634045(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of YKM11/Mistral-7B-adaptv1\n\n\n\nDataset automatically created during the evaluation run of model YKM11/Mistral-7B-adaptv1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T17:19:39.634045(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
f78ee8cd32ffbceb7b0e23adf8246e0d63aa618d | # Wilhelm Busch "Max und Moritz" Dataset
Welcome to the Wilhelm Busch "Max und Moritz" Dataset, a curated collection of 73 public domain images from the classic German children's book "Max und Moritz". This dataset has been enhanced with GPT-Vision generated captions and is ready for training AI models.
[](https://discord.com/invite/m3TBB9XEkb)
## Dataset Overview
- **Content**: The dataset contains 73 images depicting the original illustrations from "Max und Moritz", a tale of two mischievous boys, created by the German humorist, poet, illustrator, and painter Wilhelm Busch.
- **Source**: These images have been carefully selected from various online sources, each offering a glimpse into the 19th-century artwork that has become a staple in German literary culture.
- **Usage**: Ideal for training AI in understanding sequential art narrative, character recognition, and historical illustration styles.
## Licensing
- While the original images by Wilhelm Busch are in the public domain, this particular dataset, complete with its GPT-Vision generated captions, falls under the Creative Commons Attribution-NonCommercial 2.0 Generic (CC BY-NC 2.0) license. This license allows for the non-commercial use of the images and captions, as long as proper credit is provided.
- More information about this license can be found at [CC BY-NC 2.0 License details](https://creativecommons.org/licenses/by-nc/2.0/).
## Dataset Composition
Each image in this collection is accompanied by a descriptive caption, providing contextual information that can be used to train AI models. The captions are crafted to highlight key elements of the illustrations, aiding in the model's learning process.
## How to Use the Dataset
1. **Download the Dataset**: Access the collection via the provided link for academic and non-commercial research purposes.
2. **Review the Images and Captions**: Examine the illustrations and their respective captions to understand the dataset's range and depth.
3. **Train Your AI Model**: Use the dataset to train AI models in recognizing and generating artwork that reflects the style and narrative techniques of Wilhelm Busch (a minimal loading sketch follows below).
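As a minimal sketch of steps 1 and 2 — assuming the collection is published under the repository id shown on this page and exposes the usual image/caption columns of a caption dataset, neither of which is confirmed by this card — loading and inspecting one example could look like:

```python
from datasets import load_dataset

# Repository id taken from this page; the "train" split and the
# "image"/"text" column names are assumptions, not confirmed by the card.
ds = load_dataset("Blib-la/max_und_moritz_wilhelm_busch_dataset", split="train")

example = ds[0]
print(example["text"])   # GPT-Vision generated caption (assumed column name)
example["image"].show()  # PIL image of the illustration (assumed column name)
```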
## Contributions and Feedback
We appreciate any contributions or feedback aimed at improving the dataset's quality. If you would like to contribute additional images or captions or have suggestions for improvement, please contact us. Your involvement is essential for enhancing this resource for the AI and literary communities.
---
With its historical significance and charm, the Wilhelm Busch "Max und Moritz" Dataset promises to be an invaluable resource for those interested in the intersection of AI, art, and literature. | Blib-la/max_und_moritz_wilhelm_busch_dataset | [
"license:cc-by-nc-2.0",
"region:us"
] | 2024-02-02T18:01:56+00:00 | {"license": "cc-by-nc-2.0", "viewer": false} | 2024-02-03T17:39:01+00:00 | [] | [] | TAGS
#license-cc-by-nc-2.0 #region-us
| # Wilhelm Busch "Max und Moritz" Dataset
Welcome to the Wilhelm Busch "Max und Moritz" Dataset, a curated collection of 73 public domain images from the classic German children's book "Max und Moritz". This dataset has been enhanced with GPT-Vision generated captions and is ready for training AI models.

## Dataset Overview
- Content: The dataset contains 73 images depicting the original illustrations from "Max und Moritz", a tale of two mischievous boys, created by the German humorist, poet, illustrator, and painter Wilhelm Busch.
- Source: These images have been carefully selected from various online sources, each offering a glimpse into the 19th-century artwork that has become a staple in German literary culture.
- Usage: Ideal for training AI in understanding sequential art narrative, character recognition, and historical illustration styles.
## Licensing
- While the original images by Wilhelm Busch are in the public domain, this particular dataset, complete with its GPT-Vision generated captions, falls under the Creative Commons Attribution-NonCommercial 2.0 Generic (CC BY-NC 2.0) license. This license allows for the non-commercial use of the images and captions, as long as proper credit is provided.
- More information about this license can be found at CC BY-NC 2.0 License details.
## Dataset Composition
Each image in this collection is accompanied by a descriptive caption, providing contextual information that can be used to train AI models. The captions are crafted to highlight key elements of the illustrations, aiding in the model's learning process.
## How to Use the Dataset
1. Download the Dataset: Access the collection via the provided link for academic and non-commercial research purposes.
2. Review the Images and Captions: Examine the illustrations and their respective captions to understand the dataset's range and depth.
3. Train Your AI Model: Use the dataset to train AI models in recognizing and generating artwork that reflects the style and narrative techniques of Wilhelm Busch.
## Contributions and Feedback
We appreciate any contributions or feedback aimed at improving the dataset's quality. If you would like to contribute additional images or captions or have suggestions for improvement, please contact us. Your involvement is essential for enhancing this resource for the AI and literary communities.
---
With its historical significance and charm, the Wilhelm Busch "Max und Moritz" Dataset promises to be an invaluable resource for those interested in the intersection of AI, art, and literature. | [
"# Wilhelm Busch \"Max und Moritz\" Dataset\n\nWelcome to the Wilhelm Busch \"Max und Moritz\" Dataset, a curated collection of 73 public domain images from the classic German children's book \"Max und Moritz\". This dataset has been enhanced with GPT-Vision generated captions and is ready for training AI models.\n\n",
"## Dataset Overview\n\n- Content: The dataset contains 73 images depicting the original illustrations from \"Max und Moritz\", a tale of two mischievous boys, created by the German humorist, poet, illustrator, and painter Wilhelm Busch.\n- Source: These images have been carefully selected from various online sources, each offering a glimpse into the 19th-century artwork that has become a staple in German literary culture.\n- Usage: Ideal for training AI in understanding sequential art narrative, character recognition, and historical illustration styles.",
"## Licensing\n\n- While the original images by Wilhelm Busch are in the public domain, this particular dataset, complete with its GPT-Vision generated captions, falls under the Creative Commons Attribution-NonCommercial 2.0 Generic (CC BY-NC 2.0) license. This license allows for the non-commercial use of the images and captions, as long as proper credit is provided.\n- More information about this license can be found at CC BY-NC 2.0 License details.",
"## Dataset Composition\n\nEach image in this collection is accompanied by a descriptive caption, providing contextual information that can be used to train AI models. The captions are crafted to highlight key elements of the illustrations, aiding in the model's learning process.",
"## How to Use the Dataset\n\n1. Download the Dataset: Access the collection via the provided link for academic and non-commercial research purposes.\n2. Review the Images and Captions: Examine the illustrations and their respective captions to understand the dataset's range and depth.\n3. Train Your AI Model: Use the dataset to train AI models in recognizing and generating artwork that reflects the style and narrative techniques of Wilhelm Busch.",
"## Contributions and Feedback\n\nWe appreciate any contributions or feedback aimed at improving the dataset's quality. If you would like to contribute additional images or captions or have suggestions for improvement, please contact us. Your involvement is essential for enhancing this resource for the AI and literary communities.\n\n---\n\nWith its historical significance and charm, the Wilhelm Busch \"Max und Moritz\" Dataset promises to be an invaluable resource for those interested in the intersection of AI, art, and literature."
] | [
"TAGS\n#license-cc-by-nc-2.0 #region-us \n",
"# Wilhelm Busch \"Max und Moritz\" Dataset\n\nWelcome to the Wilhelm Busch \"Max und Moritz\" Dataset, a curated collection of 73 public domain images from the classic German children's book \"Max und Moritz\". This dataset has been enhanced with GPT-Vision generated captions and is ready for training AI models.\n\n",
"## Dataset Overview\n\n- Content: The dataset contains 73 images depicting the original illustrations from \"Max und Moritz\", a tale of two mischievous boys, created by the German humorist, poet, illustrator, and painter Wilhelm Busch.\n- Source: These images have been carefully selected from various online sources, each offering a glimpse into the 19th-century artwork that has become a staple in German literary culture.\n- Usage: Ideal for training AI in understanding sequential art narrative, character recognition, and historical illustration styles.",
"## Licensing\n\n- While the original images by Wilhelm Busch are in the public domain, this particular dataset, complete with its GPT-Vision generated captions, falls under the Creative Commons Attribution-NonCommercial 2.0 Generic (CC BY-NC 2.0) license. This license allows for the non-commercial use of the images and captions, as long as proper credit is provided.\n- More information about this license can be found at CC BY-NC 2.0 License details.",
"## Dataset Composition\n\nEach image in this collection is accompanied by a descriptive caption, providing contextual information that can be used to train AI models. The captions are crafted to highlight key elements of the illustrations, aiding in the model's learning process.",
"## How to Use the Dataset\n\n1. Download the Dataset: Access the collection via the provided link for academic and non-commercial research purposes.\n2. Review the Images and Captions: Examine the illustrations and their respective captions to understand the dataset's range and depth.\n3. Train Your AI Model: Use the dataset to train AI models in recognizing and generating artwork that reflects the style and narrative techniques of Wilhelm Busch.",
"## Contributions and Feedback\n\nWe appreciate any contributions or feedback aimed at improving the dataset's quality. If you would like to contribute additional images or captions or have suggestions for improvement, please contact us. Your involvement is essential for enhancing this resource for the AI and literary communities.\n\n---\n\nWith its historical significance and charm, the Wilhelm Busch \"Max und Moritz\" Dataset promises to be an invaluable resource for those interested in the intersection of AI, art, and literature."
] |
e214ccc5fc27fd4bf4cc913efe0c142349ce04f7 |
# Dataset Card for Evaluation run of BFauber/opt350m_10e6
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BFauber/opt350m_10e6](https://huggingface.co/BFauber/opt350m_10e6) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BFauber__opt350m_10e6",
"harness_winogrande_5",
split="train")
```
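The per-task snippet above can be pointed at any of the 63 configurations. If you only want the aggregated metrics, the card mentions a `results` configuration; the sketch below assumes it follows the same split layout (`latest` pointing to the most recent run) as the per-task configurations listed in this card's metadata, so treat the configuration and split names as assumptions rather than a confirmed API.

```python
from datasets import load_dataset

# "results" configuration and "latest" split are assumptions based on this
# card's description and the split layout of the per-task configurations.
results = load_dataset(
    "open-llm-leaderboard/details_BFauber__opt350m_10e6",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics for the most recent evaluation run
```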
## Latest results
These are the [latest results from run 2024-02-02T18:07:15.412777](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt350m_10e6/blob/main/results_2024-02-02T18-07-15.412777.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24971009497879715,
"acc_stderr": 0.030551769349442866,
"acc_norm": 0.2506832642322296,
"acc_norm_stderr": 0.031358540628084804,
"mc1": 0.23133414932680538,
"mc1_stderr": 0.014761945174862666,
"mc2": 0.4671195481529295,
"mc2_stderr": 0.015802968745785427
},
"harness|arc:challenge|25": {
"acc": 0.22013651877133106,
"acc_stderr": 0.01210812488346098,
"acc_norm": 0.23976109215017063,
"acc_norm_stderr": 0.012476304127453956
},
"harness|hellaswag|10": {
"acc": 0.2817167894841665,
"acc_stderr": 0.004489166767430648,
"acc_norm": 0.3236407090221072,
"acc_norm_stderr": 0.004669085411342183
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.04094376269996794,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.04094376269996794
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.025288394502891363,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.025288394502891363
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.25,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.16,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.16,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749884,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749884
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179964,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179964
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.20851063829787234,
"acc_stderr": 0.02655698211783873,
"acc_norm": 0.20851063829787234,
"acc_norm_stderr": 0.02655698211783873
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281336,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281336
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.30344827586206896,
"acc_stderr": 0.038312260488503336,
"acc_norm": 0.30344827586206896,
"acc_norm_stderr": 0.038312260488503336
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.21693121693121692,
"acc_stderr": 0.021227082449445062,
"acc_norm": 0.21693121693121692,
"acc_norm_stderr": 0.021227082449445062
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15079365079365079,
"acc_stderr": 0.03200686497287392,
"acc_norm": 0.15079365079365079,
"acc_norm_stderr": 0.03200686497287392
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.23225806451612904,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.23225806451612904,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2606060606060606,
"acc_stderr": 0.03427743175816524,
"acc_norm": 0.2606060606060606,
"acc_norm_stderr": 0.03427743175816524
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2676767676767677,
"acc_stderr": 0.03154449888270285,
"acc_norm": 0.2676767676767677,
"acc_norm_stderr": 0.03154449888270285
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3160621761658031,
"acc_stderr": 0.033553973696861736,
"acc_norm": 0.3160621761658031,
"acc_norm_stderr": 0.033553973696861736
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2,
"acc_stderr": 0.020280805062535722,
"acc_norm": 0.2,
"acc_norm_stderr": 0.020280805062535722
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2074074074074074,
"acc_stderr": 0.02472071319395216,
"acc_norm": 0.2074074074074074,
"acc_norm_stderr": 0.02472071319395216
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23949579831932774,
"acc_stderr": 0.02772206549336127,
"acc_norm": 0.23949579831932774,
"acc_norm_stderr": 0.02772206549336127
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23178807947019867,
"acc_stderr": 0.03445406271987055,
"acc_norm": 0.23178807947019867,
"acc_norm_stderr": 0.03445406271987055
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1871559633027523,
"acc_stderr": 0.01672268452620016,
"acc_norm": 0.1871559633027523,
"acc_norm_stderr": 0.01672268452620016
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.17592592592592593,
"acc_stderr": 0.02596742095825853,
"acc_norm": 0.17592592592592593,
"acc_norm_stderr": 0.02596742095825853
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23039215686274508,
"acc_stderr": 0.029554292605695063,
"acc_norm": 0.23039215686274508,
"acc_norm_stderr": 0.029554292605695063
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.21524663677130046,
"acc_stderr": 0.02758406660220827,
"acc_norm": 0.21524663677130046,
"acc_norm_stderr": 0.02758406660220827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.1984732824427481,
"acc_stderr": 0.03498149385462471,
"acc_norm": 0.1984732824427481,
"acc_norm_stderr": 0.03498149385462471
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.35537190082644626,
"acc_stderr": 0.04369236326573981,
"acc_norm": 0.35537190082644626,
"acc_norm_stderr": 0.04369236326573981
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052192,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.18446601941747573,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.18446601941747573,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2863247863247863,
"acc_stderr": 0.029614323690456648,
"acc_norm": 0.2863247863247863,
"acc_norm_stderr": 0.029614323690456648
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2669220945083014,
"acc_stderr": 0.01581845089477756,
"acc_norm": 0.2669220945083014,
"acc_norm_stderr": 0.01581845089477756
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24566473988439305,
"acc_stderr": 0.02317629820399201,
"acc_norm": 0.24566473988439305,
"acc_norm_stderr": 0.02317629820399201
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24134078212290502,
"acc_stderr": 0.01431099954796143,
"acc_norm": 0.24134078212290502,
"acc_norm_stderr": 0.01431099954796143
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.024630048979824775,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.024630048979824775
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.29260450160771706,
"acc_stderr": 0.025839898334877983,
"acc_norm": 0.29260450160771706,
"acc_norm_stderr": 0.025839898334877983
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23049645390070922,
"acc_stderr": 0.025123739226872405,
"acc_norm": 0.23049645390070922,
"acc_norm_stderr": 0.025123739226872405
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24315514993481094,
"acc_stderr": 0.010956556654417355,
"acc_norm": 0.24315514993481094,
"acc_norm_stderr": 0.010956556654417355
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4522058823529412,
"acc_stderr": 0.030233758551596452,
"acc_norm": 0.4522058823529412,
"acc_norm_stderr": 0.030233758551596452
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.017555818091322253,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.017555818091322253
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24897959183673468,
"acc_stderr": 0.027682979522960234,
"acc_norm": 0.24897959183673468,
"acc_norm_stderr": 0.027682979522960234
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.02992941540834838,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.02992941540834838
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-virology|5": {
"acc": 0.22289156626506024,
"acc_stderr": 0.03240004825594688,
"acc_norm": 0.22289156626506024,
"acc_norm_stderr": 0.03240004825594688
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.033773102522091945,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.033773102522091945
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23133414932680538,
"mc1_stderr": 0.014761945174862666,
"mc2": 0.4671195481529295,
"mc2_stderr": 0.015802968745785427
},
"harness|winogrande|5": {
"acc": 0.5035516969218626,
"acc_stderr": 0.014052131146915848
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_BFauber__opt350m_10e6 | [
"region:us"
] | 2024-02-02T18:08:57+00:00 | {"pretty_name": "Evaluation run of BFauber/opt350m_10e6", "dataset_summary": "Dataset automatically created during the evaluation run of model [BFauber/opt350m_10e6](https://huggingface.co/BFauber/opt350m_10e6) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BFauber__opt350m_10e6\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T18:07:15.412777](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt350m_10e6/blob/main/results_2024-02-02T18-07-15.412777.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24971009497879715,\n \"acc_stderr\": 0.030551769349442866,\n \"acc_norm\": 0.2506832642322296,\n \"acc_norm_stderr\": 0.031358540628084804,\n \"mc1\": 0.23133414932680538,\n \"mc1_stderr\": 0.014761945174862666,\n \"mc2\": 0.4671195481529295,\n \"mc2_stderr\": 0.015802968745785427\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.22013651877133106,\n \"acc_stderr\": 0.01210812488346098,\n \"acc_norm\": 0.23976109215017063,\n \"acc_norm_stderr\": 0.012476304127453956\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2817167894841665,\n \"acc_stderr\": 0.004489166767430648,\n \"acc_norm\": 0.3236407090221072,\n \"acc_norm_stderr\": 0.004669085411342183\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.04094376269996794,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.04094376269996794\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.025288394502891363,\n \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.025288394502891363\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.16,\n \"acc_stderr\": 
0.03684529491774708,\n \"acc_norm\": 0.16,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749884,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749884\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179964,\n \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179964\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.20851063829787234,\n \"acc_stderr\": 0.02655698211783873,\n \"acc_norm\": 0.20851063829787234,\n \"acc_norm_stderr\": 0.02655698211783873\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.03999423879281336,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.03999423879281336\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.30344827586206896,\n \"acc_stderr\": 0.038312260488503336,\n \"acc_norm\": 0.30344827586206896,\n \"acc_norm_stderr\": 0.038312260488503336\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.21693121693121692,\n \"acc_stderr\": 0.021227082449445062,\n \"acc_norm\": 0.21693121693121692,\n \"acc_norm_stderr\": 0.021227082449445062\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15079365079365079,\n \"acc_stderr\": 0.03200686497287392,\n \"acc_norm\": 0.15079365079365079,\n \"acc_norm_stderr\": 0.03200686497287392\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.23225806451612904,\n \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.23225806451612904,\n \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2606060606060606,\n \"acc_stderr\": 0.03427743175816524,\n \"acc_norm\": 0.2606060606060606,\n \"acc_norm_stderr\": 0.03427743175816524\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.2676767676767677,\n \"acc_stderr\": 0.03154449888270285,\n \"acc_norm\": 0.2676767676767677,\n \"acc_norm_stderr\": 0.03154449888270285\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.3160621761658031,\n \"acc_stderr\": 0.033553973696861736,\n \"acc_norm\": 0.3160621761658031,\n \"acc_norm_stderr\": 0.033553973696861736\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.020280805062535722,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.020280805062535722\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2074074074074074,\n \"acc_stderr\": 0.02472071319395216,\n \"acc_norm\": 0.2074074074074074,\n \"acc_norm_stderr\": 0.02472071319395216\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.23949579831932774,\n \"acc_stderr\": 0.02772206549336127,\n \"acc_norm\": 0.23949579831932774,\n \"acc_norm_stderr\": 0.02772206549336127\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.23178807947019867,\n \"acc_stderr\": 0.03445406271987055,\n \"acc_norm\": 0.23178807947019867,\n \"acc_norm_stderr\": 0.03445406271987055\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1871559633027523,\n \"acc_stderr\": 0.01672268452620016,\n \"acc_norm\": 0.1871559633027523,\n \"acc_norm_stderr\": 0.01672268452620016\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.17592592592592593,\n \"acc_stderr\": 0.02596742095825853,\n \"acc_norm\": 0.17592592592592593,\n \"acc_norm_stderr\": 0.02596742095825853\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.23039215686274508,\n \"acc_stderr\": 0.029554292605695063,\n \"acc_norm\": 0.23039215686274508,\n \"acc_norm_stderr\": 0.029554292605695063\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.21524663677130046,\n \"acc_stderr\": 0.02758406660220827,\n \"acc_norm\": 0.21524663677130046,\n \"acc_norm_stderr\": 0.02758406660220827\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.1984732824427481,\n \"acc_stderr\": 0.03498149385462471,\n \"acc_norm\": 0.1984732824427481,\n \"acc_norm_stderr\": 0.03498149385462471\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.35537190082644626,\n \"acc_stderr\": 0.04369236326573981,\n \"acc_norm\": 0.35537190082644626,\n \"acc_norm_stderr\": 0.04369236326573981\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.04284467968052192,\n \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.04284467968052192\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.18446601941747573,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.18446601941747573,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2863247863247863,\n \"acc_stderr\": 0.029614323690456648,\n \"acc_norm\": 0.2863247863247863,\n \"acc_norm_stderr\": 0.029614323690456648\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2669220945083014,\n 
\"acc_stderr\": 0.01581845089477756,\n \"acc_norm\": 0.2669220945083014,\n \"acc_norm_stderr\": 0.01581845089477756\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24566473988439305,\n \"acc_stderr\": 0.02317629820399201,\n \"acc_norm\": 0.24566473988439305,\n \"acc_norm_stderr\": 0.02317629820399201\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n \"acc_stderr\": 0.01431099954796143,\n \"acc_norm\": 0.24134078212290502,\n \"acc_norm_stderr\": 0.01431099954796143\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.024630048979824775,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.024630048979824775\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.29260450160771706,\n \"acc_stderr\": 0.025839898334877983,\n \"acc_norm\": 0.29260450160771706,\n \"acc_norm_stderr\": 0.025839898334877983\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23049645390070922,\n \"acc_stderr\": 0.025123739226872405,\n \"acc_norm\": 0.23049645390070922,\n \"acc_norm_stderr\": 0.025123739226872405\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24315514993481094,\n \"acc_stderr\": 0.010956556654417355,\n \"acc_norm\": 0.24315514993481094,\n \"acc_norm_stderr\": 0.010956556654417355\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4522058823529412,\n \"acc_stderr\": 0.030233758551596452,\n \"acc_norm\": 0.4522058823529412,\n \"acc_norm_stderr\": 0.030233758551596452\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.017555818091322253,\n \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.017555818091322253\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.24897959183673468,\n \"acc_stderr\": 0.027682979522960234,\n \"acc_norm\": 0.24897959183673468,\n \"acc_norm_stderr\": 0.027682979522960234\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n \"acc_stderr\": 0.02992941540834838,\n \"acc_norm\": 0.23383084577114427,\n \"acc_norm_stderr\": 0.02992941540834838\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.22289156626506024,\n \"acc_stderr\": 0.03240004825594688,\n \"acc_norm\": 0.22289156626506024,\n \"acc_norm_stderr\": 0.03240004825594688\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.033773102522091945,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.033773102522091945\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23133414932680538,\n \"mc1_stderr\": 0.014761945174862666,\n \"mc2\": 0.4671195481529295,\n \"mc2_stderr\": 0.015802968745785427\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5035516969218626,\n \"acc_stderr\": 0.014052131146915848\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": 
"https://huggingface.co/BFauber/opt350m_10e6", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|arc:challenge|25_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|gsm8k|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hellaswag|10_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-07-15.412777.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-07-15.412777.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-07-15.412777.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T18-07-15.412777.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-07-15.412777.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T18_07_15.412777", "path": ["**/details_harness|winogrande|5_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T18-07-15.412777.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T18_07_15.412777", "path": ["results_2024-02-02T18-07-15.412777.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T18-07-15.412777.parquet"]}]}]} | 2024-02-02T18:09:21+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of BFauber/opt350m_10e6
Dataset automatically created during the evaluation run of model BFauber/opt350m_10e6 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-02-02T18:07:15.412777 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information is needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of BFauber/opt350m_10e6\n\n\n\nDataset automatically created during the evaluation run of model BFauber/opt350m_10e6 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T18:07:15.412777(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of BFauber/opt350m_10e6\n\n\n\nDataset automatically created during the evaluation run of model BFauber/opt350m_10e6 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T18:07:15.412777(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
5040a01bc76b256007f372c24dc073615dcba79e |
# Dataset Card for Evaluation run of BFauber/opt350m_10e5
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BFauber/opt350m_10e5](https://huggingface.co/BFauber/opt350m_10e5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BFauber__opt350m_10e5",
"harness_winogrande_5",
split="train")
```
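If you only need the aggregated scores rather than the per-sample details, a minimal sketch (using the `results` config and `latest` split that this card's metadata defines) is:

```python
from datasets import load_dataset

# The "results" config stores one row of aggregated metrics per run;
# the "latest" split points to the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_BFauber__opt350m_10e5",
	"results",
	split="latest")
print(results[0])  # aggregated accuracies, stderrs, etc. for the latest run
```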
## Latest results
These are the [latest results from run 2024-02-02T18:12:11.233517](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt350m_10e5/blob/main/results_2024-02-02T18-12-11.233517.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26007171709591076,
"acc_stderr": 0.030742648583097,
"acc_norm": 0.26145245996407884,
"acc_norm_stderr": 0.03155843666482224,
"mc1": 0.23745410036719705,
"mc1_stderr": 0.01489627744104184,
"mc2": 0.4217310351619178,
"mc2_stderr": 0.014805308910592625
},
"harness|arc:challenge|25": {
"acc": 0.21416382252559726,
"acc_stderr": 0.011988383205966483,
"acc_norm": 0.24146757679180889,
"acc_norm_stderr": 0.012506564839739432
},
"harness|hellaswag|10": {
"acc": 0.31428002389962156,
"acc_stderr": 0.00463279737528977,
"acc_norm": 0.3652658832901812,
"acc_norm_stderr": 0.0048052057987245595
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.04094376269996793,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.04094376269996793
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3018867924528302,
"acc_stderr": 0.028254200344438665,
"acc_norm": 0.3018867924528302,
"acc_norm_stderr": 0.028254200344438665
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.15,
"acc_stderr": 0.035887028128263714,
"acc_norm": 0.15,
"acc_norm_stderr": 0.035887028128263714
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.03186209851641143,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.03186209851641143
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179963,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179963
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2936170212765957,
"acc_stderr": 0.029771642712491227,
"acc_norm": 0.2936170212765957,
"acc_norm_stderr": 0.029771642712491227
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436695,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436695
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.036001056927277716,
"acc_norm": 0.2482758620689655,
"acc_norm_stderr": 0.036001056927277716
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02256989707491841,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02256989707491841
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235172,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2967741935483871,
"acc_stderr": 0.0259885007924119,
"acc_norm": 0.2967741935483871,
"acc_norm_stderr": 0.0259885007924119
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.22660098522167488,
"acc_stderr": 0.02945486383529297,
"acc_norm": 0.22660098522167488,
"acc_norm_stderr": 0.02945486383529297
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24848484848484848,
"acc_stderr": 0.03374402644139405,
"acc_norm": 0.24848484848484848,
"acc_norm_stderr": 0.03374402644139405
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.30303030303030304,
"acc_stderr": 0.03274287914026867,
"acc_norm": 0.30303030303030304,
"acc_norm_stderr": 0.03274287914026867
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.35751295336787564,
"acc_stderr": 0.03458816042181005,
"acc_norm": 0.35751295336787564,
"acc_norm_stderr": 0.03458816042181005
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.34615384615384615,
"acc_stderr": 0.024121125416941183,
"acc_norm": 0.34615384615384615,
"acc_norm_stderr": 0.024121125416941183
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22592592592592592,
"acc_stderr": 0.025497532639609546,
"acc_norm": 0.22592592592592592,
"acc_norm_stderr": 0.025497532639609546
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2251655629139073,
"acc_stderr": 0.03410435282008936,
"acc_norm": 0.2251655629139073,
"acc_norm_stderr": 0.03410435282008936
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3394495412844037,
"acc_stderr": 0.02030210934266235,
"acc_norm": 0.3394495412844037,
"acc_norm_stderr": 0.02030210934266235
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.030190282453501964,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.030190282453501964
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.22362869198312235,
"acc_stderr": 0.027123298205229972,
"acc_norm": 0.22362869198312235,
"acc_norm_stderr": 0.027123298205229972
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.2914798206278027,
"acc_stderr": 0.03050028317654591,
"acc_norm": 0.2914798206278027,
"acc_norm_stderr": 0.03050028317654591
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.371900826446281,
"acc_stderr": 0.04412015806624504,
"acc_norm": 0.371900826446281,
"acc_norm_stderr": 0.04412015806624504
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04330043749650743,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04330043749650743
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3067484662576687,
"acc_stderr": 0.036230899157241474,
"acc_norm": 0.3067484662576687,
"acc_norm_stderr": 0.036230899157241474
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25,
"acc_stderr": 0.04109974682633932,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04109974682633932
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.28205128205128205,
"acc_stderr": 0.029480360549541187,
"acc_norm": 0.28205128205128205,
"acc_norm_stderr": 0.029480360549541187
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.21966794380587484,
"acc_stderr": 0.014805384478371176,
"acc_norm": 0.21966794380587484,
"acc_norm_stderr": 0.014805384478371176
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2947976878612717,
"acc_stderr": 0.024547617794803838,
"acc_norm": 0.2947976878612717,
"acc_norm_stderr": 0.024547617794803838
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25027932960893856,
"acc_stderr": 0.014487500852850423,
"acc_norm": 0.25027932960893856,
"acc_norm_stderr": 0.014487500852850423
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.02526169121972948,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.02526169121972948
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.21543408360128619,
"acc_stderr": 0.023350225475471414,
"acc_norm": 0.21543408360128619,
"acc_norm_stderr": 0.023350225475471414
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22839506172839505,
"acc_stderr": 0.023358211840626267,
"acc_norm": 0.22839506172839505,
"acc_norm_stderr": 0.023358211840626267
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24468085106382978,
"acc_stderr": 0.025645553622266722,
"acc_norm": 0.24468085106382978,
"acc_norm_stderr": 0.025645553622266722
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.01099615663514269,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.01099615663514269
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25326797385620914,
"acc_stderr": 0.017593486895366835,
"acc_norm": 0.25326797385620914,
"acc_norm_stderr": 0.017593486895366835
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072775,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072775
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.23673469387755103,
"acc_stderr": 0.027212835884073156,
"acc_norm": 0.23673469387755103,
"acc_norm_stderr": 0.027212835884073156
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2537313432835821,
"acc_stderr": 0.030769444967296014,
"acc_norm": 0.2537313432835821,
"acc_norm_stderr": 0.030769444967296014
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.21084337349397592,
"acc_stderr": 0.0317555478662992,
"acc_norm": 0.21084337349397592,
"acc_norm_stderr": 0.0317555478662992
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.03126781714663179,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.03126781714663179
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23745410036719705,
"mc1_stderr": 0.01489627744104184,
"mc2": 0.4217310351619178,
"mc2_stderr": 0.014805308910592625
},
"harness|winogrande|5": {
"acc": 0.5169692186266772,
"acc_stderr": 0.01404439040161298
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
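As a sanity check, the per-task numbers above can be aggregated by hand. The sketch below assumes the JSON block above has been saved locally as `results.json` (a hypothetical filename) and recomputes the mean accuracy over the MMLU (`hendrycksTest`) subtasks:

```python
import json

# "results.json" is a hypothetical local copy of the JSON block above.
with open("results.json") as f:
    results = json.load(f)

# MMLU subtasks are keyed "harness|hendrycksTest-<subject>|5".
mmlu = {k: v for k, v in results.items() if "hendrycksTest" in k}
mean_acc = sum(v["acc"] for v in mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU subtasks, mean acc = {mean_acc:.4f}")
```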
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_BFauber__opt350m_10e5 | [
"region:us"
] | 2024-02-02T18:13:53+00:00 | {"pretty_name": "Evaluation run of BFauber/opt350m_10e5", "dataset_summary": "Dataset automatically created during the evaluation run of model [BFauber/opt350m_10e5](https://huggingface.co/BFauber/opt350m_10e5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BFauber__opt350m_10e5\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T18:12:11.233517](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt350m_10e5/blob/main/results_2024-02-02T18-12-11.233517.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26007171709591076,\n \"acc_stderr\": 0.030742648583097,\n \"acc_norm\": 0.26145245996407884,\n \"acc_norm_stderr\": 0.03155843666482224,\n \"mc1\": 0.23745410036719705,\n \"mc1_stderr\": 0.01489627744104184,\n \"mc2\": 0.4217310351619178,\n \"mc2_stderr\": 0.014805308910592625\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.21416382252559726,\n \"acc_stderr\": 0.011988383205966483,\n \"acc_norm\": 0.24146757679180889,\n \"acc_norm_stderr\": 0.012506564839739432\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.31428002389962156,\n \"acc_stderr\": 0.00463279737528977,\n \"acc_norm\": 0.3652658832901812,\n \"acc_norm_stderr\": 0.0048052057987245595\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.04094376269996793,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.04094376269996793\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.3018867924528302,\n \"acc_stderr\": 0.028254200344438665,\n \"acc_norm\": 0.3018867924528302,\n \"acc_norm_stderr\": 0.028254200344438665\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2361111111111111,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.2361111111111111,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.18,\n 
\"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.15,\n \"acc_stderr\": 0.035887028128263714,\n \"acc_norm\": 0.15,\n \"acc_norm_stderr\": 0.035887028128263714\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2254335260115607,\n \"acc_stderr\": 0.03186209851641143,\n \"acc_norm\": 0.2254335260115607,\n \"acc_norm_stderr\": 0.03186209851641143\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179963,\n \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179963\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2936170212765957,\n \"acc_stderr\": 0.029771642712491227,\n \"acc_norm\": 0.2936170212765957,\n \"acc_norm_stderr\": 0.029771642712491227\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n \"acc_stderr\": 0.040969851398436695,\n \"acc_norm\": 0.2543859649122807,\n \"acc_norm_stderr\": 0.040969851398436695\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.036001056927277716,\n \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.036001056927277716\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.02256989707491841,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02256989707491841\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n \"acc_stderr\": 0.03970158273235172,\n \"acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.03970158273235172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2967741935483871,\n \"acc_stderr\": 0.0259885007924119,\n \"acc_norm\": 0.2967741935483871,\n \"acc_norm_stderr\": 0.0259885007924119\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.22660098522167488,\n \"acc_stderr\": 0.02945486383529297,\n \"acc_norm\": 0.22660098522167488,\n \"acc_norm_stderr\": 0.02945486383529297\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.24848484848484848,\n \"acc_stderr\": 0.03374402644139405,\n \"acc_norm\": 0.24848484848484848,\n \"acc_norm_stderr\": 0.03374402644139405\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.30303030303030304,\n \"acc_stderr\": 0.03274287914026867,\n \"acc_norm\": 0.30303030303030304,\n \"acc_norm_stderr\": 0.03274287914026867\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.35751295336787564,\n \"acc_stderr\": 0.03458816042181005,\n \"acc_norm\": 0.35751295336787564,\n \"acc_norm_stderr\": 0.03458816042181005\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.34615384615384615,\n \"acc_stderr\": 0.024121125416941183,\n \"acc_norm\": 0.34615384615384615,\n \"acc_norm_stderr\": 0.024121125416941183\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.22592592592592592,\n \"acc_stderr\": 0.025497532639609546,\n \"acc_norm\": 0.22592592592592592,\n \"acc_norm_stderr\": 0.025497532639609546\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2251655629139073,\n \"acc_stderr\": 0.03410435282008936,\n \"acc_norm\": 0.2251655629139073,\n \"acc_norm_stderr\": 0.03410435282008936\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3394495412844037,\n \"acc_stderr\": 0.02030210934266235,\n \"acc_norm\": 0.3394495412844037,\n \"acc_norm_stderr\": 0.02030210934266235\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.030190282453501964,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.030190282453501964\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.22362869198312235,\n \"acc_stderr\": 0.027123298205229972,\n \"acc_norm\": 0.22362869198312235,\n \"acc_norm_stderr\": 0.027123298205229972\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2914798206278027,\n \"acc_stderr\": 0.03050028317654591,\n \"acc_norm\": 0.2914798206278027,\n \"acc_norm_stderr\": 0.03050028317654591\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.037683359597287434,\n \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.037683359597287434\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.371900826446281,\n \"acc_stderr\": 0.04412015806624504,\n \"acc_norm\": 0.371900826446281,\n \"acc_norm_stderr\": 0.04412015806624504\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.04330043749650743,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.04330043749650743\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.3067484662576687,\n \"acc_stderr\": 0.036230899157241474,\n \"acc_norm\": 0.3067484662576687,\n \"acc_norm_stderr\": 0.036230899157241474\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04109974682633932,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04109974682633932\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.28205128205128205,\n \"acc_stderr\": 0.029480360549541187,\n \"acc_norm\": 0.28205128205128205,\n \"acc_norm_stderr\": 0.029480360549541187\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.21966794380587484,\n \"acc_stderr\": 0.014805384478371176,\n \"acc_norm\": 0.21966794380587484,\n \"acc_norm_stderr\": 0.014805384478371176\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2947976878612717,\n \"acc_stderr\": 0.024547617794803838,\n \"acc_norm\": 0.2947976878612717,\n \"acc_norm_stderr\": 0.024547617794803838\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25027932960893856,\n \"acc_stderr\": 0.014487500852850423,\n \"acc_norm\": 0.25027932960893856,\n \"acc_norm_stderr\": 0.014487500852850423\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.02526169121972948,\n \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.02526169121972948\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.21543408360128619,\n \"acc_stderr\": 0.023350225475471414,\n \"acc_norm\": 0.21543408360128619,\n \"acc_norm_stderr\": 0.023350225475471414\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.22839506172839505,\n \"acc_stderr\": 0.023358211840626267,\n \"acc_norm\": 0.22839506172839505,\n \"acc_norm_stderr\": 0.023358211840626267\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.24468085106382978,\n \"acc_stderr\": 0.025645553622266722,\n \"acc_norm\": 0.24468085106382978,\n \"acc_norm_stderr\": 0.025645553622266722\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.01099615663514269,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.01099615663514269\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25326797385620914,\n \"acc_stderr\": 0.017593486895366835,\n \"acc_norm\": 0.25326797385620914,\n \"acc_norm_stderr\": 0.017593486895366835\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n \"acc_stderr\": 0.04013964554072775,\n \"acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.04013964554072775\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.23673469387755103,\n \"acc_stderr\": 0.027212835884073156,\n \"acc_norm\": 0.23673469387755103,\n \"acc_norm_stderr\": 0.027212835884073156\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2537313432835821,\n \"acc_stderr\": 0.030769444967296014,\n \"acc_norm\": 0.2537313432835821,\n \"acc_norm_stderr\": 0.030769444967296014\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.21084337349397592,\n \"acc_stderr\": 0.0317555478662992,\n \"acc_norm\": 0.21084337349397592,\n \"acc_norm_stderr\": 0.0317555478662992\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.03126781714663179,\n \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.03126781714663179\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23745410036719705,\n \"mc1_stderr\": 0.01489627744104184,\n \"mc2\": 0.4217310351619178,\n \"mc2_stderr\": 0.014805308910592625\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5169692186266772,\n \"acc_stderr\": 0.01404439040161298\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", 
"repo_url": "https://huggingface.co/BFauber/opt350m_10e5", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|arc:challenge|25_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|gsm8k|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hellaswag|10_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-12-11.233517.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-12-11.233517.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-12-11.233517.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T18-12-11.233517.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-12-11.233517.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T18_12_11.233517", "path": ["**/details_harness|winogrande|5_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T18-12-11.233517.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T18_12_11.233517", "path": ["results_2024-02-02T18-12-11.233517.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T18-12-11.233517.parquet"]}]}]} | 2024-02-02T18:14:22+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of BFauber/opt350m_10e5
Dataset automatically created during the evaluation run of model BFauber/opt350m_10e5 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
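For instance (a minimal sketch; the repository name details_BFauber__opt350m_10e5 is inferred from the model name and the details_<org>__<model> naming convention used by the other cards in this dump):

```python
from datasets import load_dataset

# Load the per-sample details of the 5-shot Winogrande eval; the repo name
# follows the details_<org>__<model> convention of the Open LLM Leaderboard.
data = load_dataset("open-llm-leaderboard/details_BFauber__opt350m_10e5",
                    "harness_winogrande_5",
                    split="train")
```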
## Latest results
These are the latest results from run 2024-02-02T18:12:11.233517 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of BFauber/opt350m_10e5\n\n\n\nDataset automatically created during the evaluation run of model BFauber/opt350m_10e5 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split always points to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T18:12:11.233517 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of BFauber/opt350m_10e5\n\n\n\nDataset automatically created during the evaluation run of model BFauber/opt350m_10e5 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split always points to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T18:12:11.233517 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
7d99e0b5bcca52ce141dda00fb1ee70e21ea1aaa |
# Dataset Card for Evaluation run of ConvexAI/BurningBruce-004
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ConvexAI/BurningBruce-004](https://huggingface.co/ConvexAI/BurningBruce-004) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ConvexAI__BurningBruce-004",
"harness_winogrande_5",
split="train")
```
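The aggregated metrics live in the separate "results" configuration mentioned above; a minimal sketch of loading them (the config and split names are taken from this repository's own configuration list):

```python
from datasets import load_dataset

# The "results" configuration holds one row per evaluation run with the
# aggregated scores; the "latest" split points to the most recent run.
results = load_dataset("open-llm-leaderboard/details_ConvexAI__BurningBruce-004",
                       "results",
                       split="latest")
print(results[0])
```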
## Latest results
These are the [latest results from run 2024-02-02T18:13:25.503576](https://huggingface.co/datasets/open-llm-leaderboard/details_ConvexAI__BurningBruce-004/blob/main/results_2024-02-02T18-13-25.503576.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6532196328486137,
"acc_stderr": 0.03212525213314176,
"acc_norm": 0.6523598486584207,
"acc_norm_stderr": 0.03279924819889533,
"mc1": 0.5520195838433293,
"mc1_stderr": 0.017408513063422917,
"mc2": 0.6839428568931519,
"mc2_stderr": 0.0152008998758035
},
"harness|arc:challenge|25": {
"acc": 0.7098976109215017,
"acc_stderr": 0.013261573677520769,
"acc_norm": 0.7329351535836177,
"acc_norm_stderr": 0.012928933196496364
},
"harness|hellaswag|10": {
"acc": 0.7200756821350328,
"acc_stderr": 0.004480442446762916,
"acc_norm": 0.8862776339374626,
"acc_norm_stderr": 0.003168249351889309
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.03514942551267439,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.03514942551267439
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.02535574126305527,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.02535574126305527
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267045,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267045
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.02882088466625326,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.02882088466625326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.030684737115135356,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.030684737115135356
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374308,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374308
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.026558372502661916,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.026558372502661916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128137,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128137
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.0134682016140663,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.0134682016140663
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545543,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545543
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.44581005586592176,
"acc_stderr": 0.016623998513333106,
"acc_norm": 0.44581005586592176,
"acc_norm_stderr": 0.016623998513333106
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4667535853976532,
"acc_stderr": 0.01274197433389723,
"acc_norm": 0.4667535853976532,
"acc_norm_stderr": 0.01274197433389723
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142777,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142777
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5520195838433293,
"mc1_stderr": 0.017408513063422917,
"mc2": 0.6839428568931519,
"mc2_stderr": 0.0152008998758035
},
"harness|winogrande|5": {
"acc": 0.840568271507498,
"acc_stderr": 0.010288617479454764
},
"harness|gsm8k|5": {
"acc": 0.7058377558756633,
"acc_stderr": 0.012551285331470152
}
}
```
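To inspect the per-sample predictions behind any of the scores above, load the matching task configuration; names follow the harness_<task>_<num_fewshot> pattern listed in this repository's metadata. A sketch using the 5-shot GSM8K config (field names vary by task):

```python
from datasets import load_dataset

# Per-example details for the 5-shot GSM8K evaluation; swap the config
# name to drill into any other task (e.g. "harness_hellaswag_10").
gsm8k_details = load_dataset("open-llm-leaderboard/details_ConvexAI__BurningBruce-004",
                             "harness_gsm8k_5",
                             split="latest")
print(gsm8k_details[0])  # one evaluated example with its prompt, output and metrics
```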
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_ConvexAI__BurningBruce-004 | [
"region:us"
] | 2024-02-02T18:15:45+00:00 | {"pretty_name": "Evaluation run of ConvexAI/BurningBruce-004", "dataset_summary": "Dataset automatically created during the evaluation run of model [ConvexAI/BurningBruce-004](https://huggingface.co/ConvexAI/BurningBruce-004) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ConvexAI__BurningBruce-004\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T18:13:25.503576](https://huggingface.co/datasets/open-llm-leaderboard/details_ConvexAI__BurningBruce-004/blob/main/results_2024-02-02T18-13-25.503576.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6532196328486137,\n \"acc_stderr\": 0.03212525213314176,\n \"acc_norm\": 0.6523598486584207,\n \"acc_norm_stderr\": 0.03279924819889533,\n \"mc1\": 0.5520195838433293,\n \"mc1_stderr\": 0.017408513063422917,\n \"mc2\": 0.6839428568931519,\n \"mc2_stderr\": 0.0152008998758035\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7098976109215017,\n \"acc_stderr\": 0.013261573677520769,\n \"acc_norm\": 0.7329351535836177,\n \"acc_norm_stderr\": 0.012928933196496364\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7200756821350328,\n \"acc_stderr\": 0.004480442446762916,\n \"acc_norm\": 0.8862776339374626,\n \"acc_norm_stderr\": 0.003168249351889309\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n 
\"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.03514942551267439,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.03514942551267439\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.02535574126305527,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.02535574126305527\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135356,\n \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135356\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374308,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374308\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7890295358649789,\n \"acc_stderr\": 0.026558372502661916,\n \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.026558372502661916\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.02158649400128137,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.02158649400128137\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n \"acc_stderr\": 
0.0134682016140663,\n \"acc_norm\": 0.8288633461047255,\n \"acc_norm_stderr\": 0.0134682016140663\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.44581005586592176,\n \"acc_stderr\": 0.016623998513333106,\n \"acc_norm\": 0.44581005586592176,\n \"acc_norm_stderr\": 0.016623998513333106\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n \"acc_stderr\": 0.01274197433389723,\n \"acc_norm\": 0.4667535853976532,\n \"acc_norm_stderr\": 0.01274197433389723\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335303,\n \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335303\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142777,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142777\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5520195838433293,\n \"mc1_stderr\": 0.017408513063422917,\n \"mc2\": 0.6839428568931519,\n \"mc2_stderr\": 0.0152008998758035\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.840568271507498,\n \"acc_stderr\": 0.010288617479454764\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7058377558756633,\n \"acc_stderr\": 0.012551285331470152\n }\n}\n```", "repo_url": 
"https://huggingface.co/ConvexAI/BurningBruce-004", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|arc:challenge|25_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|gsm8k|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hellaswag|10_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-13-25.503576.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-13-25.503576.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-13-25.503576.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T18-13-25.503576.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-13-25.503576.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T18_13_25.503576", "path": ["**/details_harness|winogrande|5_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T18-13-25.503576.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T18_13_25.503576", "path": ["results_2024-02-02T18-13-25.503576.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T18-13-25.503576.parquet"]}]}]} | 2024-02-02T18:16:10+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of ConvexAI/BurningBruce-004
Dataset automatically created during the evaluation run of model ConvexAI/BurningBruce-004 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
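For instance, mirroring the loading snippet preserved in this card's metadata (a minimal sketch that assumes the `datasets` library is installed):

```python
from datasets import load_dataset

# Load the per-example details for one task configuration;
# the "train" split always points to the latest run's results.
data = load_dataset("open-llm-leaderboard/details_ConvexAI__BurningBruce-004",
	"harness_winogrande_5",
	split="train")
```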
## Latest results
These are the latest results from run 2024-02-02T18:13:25.503576 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of ConvexAI/BurningBruce-004\n\n\n\nDataset automatically created during the evaluation run of model ConvexAI/BurningBruce-004 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T18:13:25.503576(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of ConvexAI/BurningBruce-004\n\n\n\nDataset automatically created during the evaluation run of model ConvexAI/BurningBruce-004 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T18:13:25.503576(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
d6452349be126871c0fd911a336aaa659bf9ecbe |
# Dataset Card for Evaluation run of BFauber/opt125m_10e5
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BFauber/opt125m_10e5](https://huggingface.co/BFauber/opt125m_10e5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BFauber__opt125m_10e5",
"harness_winogrande_5",
split="train")
```
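To pull the aggregated metrics rather than per-example details, the "results" configuration can be loaded the same way (a sketch assuming this repository follows the config/split layout shown in the leaderboard metadata, where "latest" names the most recent run):

```python
from datasets import load_dataset

# The "results" configuration holds the run-level aggregated metrics;
# the "latest" split points to the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_BFauber__opt125m_10e5",
	"results",
	split="latest")
```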
## Latest results
These are the [latest results from run 2024-02-02T18:15:22.579177](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt125m_10e5/blob/main/results_2024-02-02T18-15-22.579177.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26366308643561104,
"acc_stderr": 0.031026830173493627,
"acc_norm": 0.26504164089701815,
"acc_norm_stderr": 0.03185660153562,
"mc1": 0.23378212974296206,
"mc1_stderr": 0.01481619599193158,
"mc2": 0.43916764945895703,
"mc2_stderr": 0.015165178906548546
},
"harness|arc:challenge|25": {
"acc": 0.2030716723549488,
"acc_stderr": 0.011755899303705582,
"acc_norm": 0.24658703071672355,
"acc_norm_stderr": 0.01259572626879014
},
"harness|hellaswag|10": {
"acc": 0.28719378609838675,
"acc_stderr": 0.004515280911468836,
"acc_norm": 0.31228838876717785,
"acc_norm_stderr": 0.004624796348128796
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.039154506304142495,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.039154506304142495
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2641509433962264,
"acc_stderr": 0.02713429162874171,
"acc_norm": 0.2641509433962264,
"acc_norm_stderr": 0.02713429162874171
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.0309528902177499,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.0309528902177499
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082633,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082633
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3191489361702128,
"acc_stderr": 0.030472973363380045,
"acc_norm": 0.3191489361702128,
"acc_norm_stderr": 0.030472973363380045
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2206896551724138,
"acc_stderr": 0.03455930201924812,
"acc_norm": 0.2206896551724138,
"acc_norm_stderr": 0.03455930201924812
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.022569897074918417,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.022569897074918417
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.19047619047619047,
"acc_stderr": 0.03512207412302052,
"acc_norm": 0.19047619047619047,
"acc_norm_stderr": 0.03512207412302052
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3161290322580645,
"acc_stderr": 0.02645087448904277,
"acc_norm": 0.3161290322580645,
"acc_norm_stderr": 0.02645087448904277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.031447125816782405,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.031447125816782405
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24848484848484848,
"acc_stderr": 0.03374402644139404,
"acc_norm": 0.24848484848484848,
"acc_norm_stderr": 0.03374402644139404
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.029857515673386407,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.029857515673386407
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.35233160621761656,
"acc_stderr": 0.03447478286414359,
"acc_norm": 0.35233160621761656,
"acc_norm_stderr": 0.03447478286414359
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.30256410256410254,
"acc_stderr": 0.02329088805377272,
"acc_norm": 0.30256410256410254,
"acc_norm_stderr": 0.02329088805377272
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340492,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3445378151260504,
"acc_stderr": 0.030868682604121633,
"acc_norm": 0.3445378151260504,
"acc_norm_stderr": 0.030868682604121633
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23178807947019867,
"acc_stderr": 0.03445406271987054,
"acc_norm": 0.23178807947019867,
"acc_norm_stderr": 0.03445406271987054
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.25688073394495414,
"acc_stderr": 0.01873249292834247,
"acc_norm": 0.25688073394495414,
"acc_norm_stderr": 0.01873249292834247
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.36771300448430494,
"acc_stderr": 0.03236198350928275,
"acc_norm": 0.36771300448430494,
"acc_norm_stderr": 0.03236198350928275
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2331288343558282,
"acc_stderr": 0.033220157957767414,
"acc_norm": 0.2331288343558282,
"acc_norm_stderr": 0.033220157957767414
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.23214285714285715,
"acc_stderr": 0.04007341809755806,
"acc_norm": 0.23214285714285715,
"acc_norm_stderr": 0.04007341809755806
},
"harness|hendrycksTest-management|5": {
"acc": 0.22330097087378642,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.22330097087378642,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19658119658119658,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.19658119658119658,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23499361430395913,
"acc_stderr": 0.015162024152278434,
"acc_norm": 0.23499361430395913,
"acc_norm_stderr": 0.015162024152278434
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.022075709251757183,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.022075709251757183
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2765273311897106,
"acc_stderr": 0.025403832978179604,
"acc_norm": 0.2765273311897106,
"acc_norm_stderr": 0.025403832978179604
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.024922001168886338,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.024922001168886338
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2624113475177305,
"acc_stderr": 0.02624492034984301,
"acc_norm": 0.2624113475177305,
"acc_norm_stderr": 0.02624492034984301
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2392438070404172,
"acc_stderr": 0.010896123652676651,
"acc_norm": 0.2392438070404172,
"acc_norm_stderr": 0.010896123652676651
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.016906615927288145,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.016906615927288145
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.24545454545454545,
"acc_stderr": 0.041220665028782834,
"acc_norm": 0.24545454545454545,
"acc_norm_stderr": 0.041220665028782834
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.19183673469387755,
"acc_stderr": 0.02520696315422538,
"acc_norm": 0.19183673469387755,
"acc_norm_stderr": 0.02520696315422538
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409224,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409224
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3132530120481928,
"acc_stderr": 0.036108050180310235,
"acc_norm": 0.3132530120481928,
"acc_norm_stderr": 0.036108050180310235
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03218093795602357,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03218093795602357
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23378212974296206,
"mc1_stderr": 0.01481619599193158,
"mc2": 0.43916764945895703,
"mc2_stderr": 0.015165178906548546
},
"harness|winogrande|5": {
"acc": 0.5146014206787688,
"acc_stderr": 0.014046492383275837
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
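These aggregated numbers can also be retrieved programmatically rather than read off the card. A minimal sketch, assuming the `datasets` library and the "results" config and "latest" split listed in this card's metadata:

```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics for this run
results = load_dataset("open-llm-leaderboard/details_BFauber__opt125m_10e5",
                       "results",
                       split="latest")
print(results.column_names)  # inspect which aggregated fields are available
```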
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_BFauber__opt125m_10e5 | [
"region:us"
] | 2024-02-02T18:17:07+00:00 | {"pretty_name": "Evaluation run of BFauber/opt125m_10e5", "dataset_summary": "Dataset automatically created during the evaluation run of model [BFauber/opt125m_10e5](https://huggingface.co/BFauber/opt125m_10e5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BFauber__opt125m_10e5\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T18:15:22.579177](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt125m_10e5/blob/main/results_2024-02-02T18-15-22.579177.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26366308643561104,\n \"acc_stderr\": 0.031026830173493627,\n \"acc_norm\": 0.26504164089701815,\n \"acc_norm_stderr\": 0.03185660153562,\n \"mc1\": 0.23378212974296206,\n \"mc1_stderr\": 0.01481619599193158,\n \"mc2\": 0.43916764945895703,\n \"mc2_stderr\": 0.015165178906548546\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.2030716723549488,\n \"acc_stderr\": 0.011755899303705582,\n \"acc_norm\": 0.24658703071672355,\n \"acc_norm_stderr\": 0.01259572626879014\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.28719378609838675,\n \"acc_stderr\": 0.004515280911468836,\n \"acc_norm\": 0.31228838876717785,\n \"acc_norm_stderr\": 0.004624796348128796\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.28888888888888886,\n \"acc_stderr\": 0.039154506304142495,\n \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.039154506304142495\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2641509433962264,\n \"acc_stderr\": 0.02713429162874171,\n \"acc_norm\": 0.2641509433962264,\n \"acc_norm_stderr\": 0.02713429162874171\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 
0.03861229196653694,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.0309528902177499,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.0309528902177499\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082633,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082633\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3191489361702128,\n \"acc_stderr\": 0.030472973363380045,\n \"acc_norm\": 0.3191489361702128,\n \"acc_norm_stderr\": 0.030472973363380045\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2206896551724138,\n \"acc_stderr\": 0.03455930201924812,\n \"acc_norm\": 0.2206896551724138,\n \"acc_norm_stderr\": 0.03455930201924812\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.022569897074918417,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.022569897074918417\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n \"acc_stderr\": 0.03512207412302052,\n \"acc_norm\": 0.19047619047619047,\n \"acc_norm_stderr\": 0.03512207412302052\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3161290322580645,\n \"acc_stderr\": 0.02645087448904277,\n \"acc_norm\": 0.3161290322580645,\n \"acc_norm_stderr\": 0.02645087448904277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.031447125816782405,\n \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.031447125816782405\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.24848484848484848,\n \"acc_stderr\": 0.03374402644139404,\n \"acc_norm\": 0.24848484848484848,\n \"acc_norm_stderr\": 0.03374402644139404\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.22727272727272727,\n \"acc_stderr\": 0.029857515673386407,\n \"acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.029857515673386407\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.35233160621761656,\n \"acc_stderr\": 0.03447478286414359,\n \"acc_norm\": 0.35233160621761656,\n \"acc_norm_stderr\": 0.03447478286414359\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.30256410256410254,\n \"acc_stderr\": 0.02329088805377272,\n \"acc_norm\": 0.30256410256410254,\n \"acc_norm_stderr\": 0.02329088805377272\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3445378151260504,\n \"acc_stderr\": 0.030868682604121633,\n \"acc_norm\": 0.3445378151260504,\n \"acc_norm_stderr\": 0.030868682604121633\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.23178807947019867,\n \"acc_stderr\": 0.03445406271987054,\n \"acc_norm\": 0.23178807947019867,\n \"acc_norm_stderr\": 0.03445406271987054\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.25688073394495414,\n \"acc_stderr\": 0.01873249292834247,\n \"acc_norm\": 0.25688073394495414,\n \"acc_norm_stderr\": 0.01873249292834247\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604246,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604246\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.36771300448430494,\n \"acc_stderr\": 0.03236198350928275,\n \"acc_norm\": 0.36771300448430494,\n \"acc_norm_stderr\": 0.03236198350928275\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.24793388429752067,\n \"acc_stderr\": 0.03941897526516303,\n \"acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.03941897526516303\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.033220157957767414,\n \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.033220157957767414\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.23214285714285715,\n \"acc_stderr\": 0.04007341809755806,\n \"acc_norm\": 0.23214285714285715,\n \"acc_norm_stderr\": 0.04007341809755806\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.22330097087378642,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.22330097087378642,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19658119658119658,\n \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.19658119658119658,\n \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.23499361430395913,\n \"acc_stderr\": 0.015162024152278434,\n \"acc_norm\": 0.23499361430395913,\n \"acc_norm_stderr\": 0.015162024152278434\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2138728323699422,\n \"acc_stderr\": 0.022075709251757183,\n \"acc_norm\": 0.2138728323699422,\n \"acc_norm_stderr\": 0.022075709251757183\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.24836601307189543,\n \"acc_stderr\": 0.02473998135511359,\n \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.02473998135511359\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2765273311897106,\n \"acc_stderr\": 0.025403832978179604,\n \"acc_norm\": 0.2765273311897106,\n \"acc_norm_stderr\": 0.025403832978179604\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.024922001168886338,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.024922001168886338\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2624113475177305,\n \"acc_stderr\": 0.02624492034984301,\n \"acc_norm\": 0.2624113475177305,\n \"acc_norm_stderr\": 0.02624492034984301\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2392438070404172,\n \"acc_stderr\": 0.010896123652676651,\n \"acc_norm\": 0.2392438070404172,\n \"acc_norm_stderr\": 0.010896123652676651\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.016906615927288145,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.016906615927288145\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.24545454545454545,\n \"acc_stderr\": 0.041220665028782834,\n \"acc_norm\": 0.24545454545454545,\n \"acc_norm_stderr\": 0.041220665028782834\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.19183673469387755,\n \"acc_stderr\": 0.02520696315422538,\n \"acc_norm\": 0.19183673469387755,\n \"acc_norm_stderr\": 0.02520696315422538\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n \"acc_stderr\": 0.030147775935409224,\n \"acc_norm\": 0.23880597014925373,\n \"acc_norm_stderr\": 0.030147775935409224\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3132530120481928,\n \"acc_stderr\": 0.036108050180310235,\n \"acc_norm\": 0.3132530120481928,\n \"acc_norm_stderr\": 0.036108050180310235\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.22807017543859648,\n \"acc_stderr\": 0.03218093795602357,\n \"acc_norm\": 0.22807017543859648,\n \"acc_norm_stderr\": 0.03218093795602357\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23378212974296206,\n \"mc1_stderr\": 0.01481619599193158,\n \"mc2\": 0.43916764945895703,\n \"mc2_stderr\": 0.015165178906548546\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5146014206787688,\n \"acc_stderr\": 0.014046492383275837\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n 
}\n}\n```", "repo_url": "https://huggingface.co/BFauber/opt125m_10e5", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|arc:challenge|25_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|gsm8k|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hellaswag|10_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-15-22.579177.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-15-22.579177.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-15-22.579177.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T18-15-22.579177.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-15-22.579177.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T18_15_22.579177", "path": ["**/details_harness|winogrande|5_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T18-15-22.579177.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T18_15_22.579177", "path": ["results_2024-02-02T18-15-22.579177.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T18-15-22.579177.parquet"]}]}]} | 2024-02-02T18:17:30+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of BFauber/opt125m_10e5
Dataset automatically created during the evaluation run of model BFauber/opt125m_10e5 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
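(The fenced snippet was stripped in this plain-text rendering; below is a minimal sketch using the `datasets` library, with `harness_winogrande_5` taken from the configs listed in this card's metadata.)

```python
from datasets import load_dataset

# Per-sample details for the 5-shot Winogrande eval of this run
data = load_dataset("open-llm-leaderboard/details_BFauber__opt125m_10e5",
                    "harness_winogrande_5",
                    split="train")
```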
## Latest results
These are the latest results from run 2024-02-02T18:15:22.579177 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of BFauber/opt125m_10e5\n\n\n\nDataset automatically created during the evaluation run of model BFauber/opt125m_10e5 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split always points to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T18:15:22.579177 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of BFauber/opt125m_10e5\n\n\n\nDataset automatically created during the evaluation run of model BFauber/opt125m_10e5 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split always points to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T18:15:22.579177 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
68c0a234467eff3b49015b3bfa84074fea875488 |
# Dataset Card for Evaluation run of BFauber/opt125m_10e6_run1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BFauber/opt125m_10e6_run1](https://huggingface.co/BFauber/opt125m_10e6_run1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

# Per-sample details for one task configuration (here: 5-shot Winogrande);
# the "train" split always points to the latest evaluation run.
data = load_dataset("open-llm-leaderboard/details_BFauber__opt125m_10e6_run1",
	"harness_winogrande_5",
	split="train")
```
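The same call works for any of the other configurations in this repository. For example, the aggregated metrics live in the "results" configuration, and every run is also reachable through its timestamped split; the sketch below uses only configuration and split names that appear in this card's metadata:

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run; "latest" mirrors the
# timestamped split of that run (here 2024_02_02T18_19_34.951673).
results = load_dataset("open-llm-leaderboard/details_BFauber__opt125m_10e6_run1",
	"results",
	split="latest")
```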
## Latest results
These are the [latest results from run 2024-02-02T18:19:34.951673](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt125m_10e6_run1/blob/main/results_2024-02-02T18-19-34.951673.json) (note that there might be results for other tasks in this repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2453956177453566,
"acc_stderr": 0.03035774790592599,
"acc_norm": 0.24574841257866145,
"acc_norm_stderr": 0.031160600953299776,
"mc1": 0.24724602203182375,
"mc1_stderr": 0.01510240479735965,
"mc2": 0.48593837171548643,
"mc2_stderr": 0.01578462194827542
},
"harness|arc:challenge|25": {
"acc": 0.2090443686006826,
"acc_stderr": 0.011882746987406455,
"acc_norm": 0.23976109215017063,
"acc_norm_stderr": 0.012476304127453956
},
"harness|hellaswag|10": {
"acc": 0.27693686516630156,
"acc_stderr": 0.00446570481089354,
"acc_norm": 0.29794861581358295,
"acc_norm_stderr": 0.004564220870531578
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17105263157894737,
"acc_stderr": 0.030643607071677088,
"acc_norm": 0.17105263157894737,
"acc_norm_stderr": 0.030643607071677088
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.23018867924528302,
"acc_stderr": 0.025907897122408173,
"acc_norm": 0.23018867924528302,
"acc_norm_stderr": 0.025907897122408173
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.17,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.17,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483099,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483099
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.13725490196078433,
"acc_stderr": 0.034240846698915216,
"acc_norm": 0.13725490196078433,
"acc_norm_stderr": 0.034240846698915216
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2127659574468085,
"acc_stderr": 0.026754391348039776,
"acc_norm": 0.2127659574468085,
"acc_norm_stderr": 0.026754391348039776
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748141,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748141
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.296551724137931,
"acc_stderr": 0.03806142687309993,
"acc_norm": 0.296551724137931,
"acc_norm_stderr": 0.03806142687309993
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.022717467897708617,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.022717467897708617
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15079365079365079,
"acc_stderr": 0.03200686497287392,
"acc_norm": 0.15079365079365079,
"acc_norm_stderr": 0.03200686497287392
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653695,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653695
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25483870967741934,
"acc_stderr": 0.024790118459332215,
"acc_norm": 0.25483870967741934,
"acc_norm_stderr": 0.024790118459332215
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.28078817733990147,
"acc_stderr": 0.03161856335358609,
"acc_norm": 0.28078817733990147,
"acc_norm_stderr": 0.03161856335358609
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2606060606060606,
"acc_stderr": 0.03427743175816524,
"acc_norm": 0.2606060606060606,
"acc_norm_stderr": 0.03427743175816524
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2474747474747475,
"acc_stderr": 0.030746300742124488,
"acc_norm": 0.2474747474747475,
"acc_norm_stderr": 0.030746300742124488
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21761658031088082,
"acc_stderr": 0.02977866303775296,
"acc_norm": 0.21761658031088082,
"acc_norm_stderr": 0.02977866303775296
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.22564102564102564,
"acc_stderr": 0.021193632525148547,
"acc_norm": 0.22564102564102564,
"acc_norm_stderr": 0.021193632525148547
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.026265024608275886,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.026265024608275886
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.21834862385321102,
"acc_stderr": 0.017712600528722734,
"acc_norm": 0.21834862385321102,
"acc_norm_stderr": 0.017712600528722734
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1712962962962963,
"acc_stderr": 0.025695341643824685,
"acc_norm": 0.1712962962962963,
"acc_norm_stderr": 0.025695341643824685
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.27941176470588236,
"acc_stderr": 0.031493281045079556,
"acc_norm": 0.27941176470588236,
"acc_norm_stderr": 0.031493281045079556
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2742616033755274,
"acc_stderr": 0.029041333510598025,
"acc_norm": 0.2742616033755274,
"acc_norm_stderr": 0.029041333510598025
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.2062780269058296,
"acc_stderr": 0.027157150479563824,
"acc_norm": 0.2062780269058296,
"acc_norm_stderr": 0.027157150479563824
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.183206106870229,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.183206106870229,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.35537190082644626,
"acc_stderr": 0.04369236326573981,
"acc_norm": 0.35537190082644626,
"acc_norm_stderr": 0.04369236326573981
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25,
"acc_stderr": 0.04109974682633932,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04109974682633932
},
"harness|hendrycksTest-management|5": {
"acc": 0.21359223300970873,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.21359223300970873,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.28205128205128205,
"acc_stderr": 0.029480360549541194,
"acc_norm": 0.28205128205128205,
"acc_norm_stderr": 0.029480360549541194
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26947637292464877,
"acc_stderr": 0.015866243073215054,
"acc_norm": 0.26947637292464877,
"acc_norm_stderr": 0.015866243073215054
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.28901734104046245,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.28901734104046245,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2797427652733119,
"acc_stderr": 0.025494259350694888,
"acc_norm": 0.2797427652733119,
"acc_norm_stderr": 0.025494259350694888
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2191358024691358,
"acc_stderr": 0.02301670564026219,
"acc_norm": 0.2191358024691358,
"acc_norm_stderr": 0.02301670564026219
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2127659574468085,
"acc_stderr": 0.024414612974307713,
"acc_norm": 0.2127659574468085,
"acc_norm_stderr": 0.024414612974307713
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.25358539765319427,
"acc_stderr": 0.011111715336101143,
"acc_norm": 0.25358539765319427,
"acc_norm_stderr": 0.011111715336101143
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.15441176470588236,
"acc_stderr": 0.021950024722922026,
"acc_norm": 0.15441176470588236,
"acc_norm_stderr": 0.021950024722922026
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.18181818181818182,
"acc_stderr": 0.036942843353378,
"acc_norm": 0.18181818181818182,
"acc_norm_stderr": 0.036942843353378
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24081632653061225,
"acc_stderr": 0.027372942201788163,
"acc_norm": 0.24081632653061225,
"acc_norm_stderr": 0.027372942201788163
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2835820895522388,
"acc_stderr": 0.031871875379197966,
"acc_norm": 0.2835820895522388,
"acc_norm_stderr": 0.031871875379197966
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-virology|5": {
"acc": 0.1927710843373494,
"acc_stderr": 0.03070982405056527,
"acc_norm": 0.1927710843373494,
"acc_norm_stderr": 0.03070982405056527
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.29239766081871343,
"acc_stderr": 0.034886477134579215,
"acc_norm": 0.29239766081871343,
"acc_norm_stderr": 0.034886477134579215
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24724602203182375,
"mc1_stderr": 0.01510240479735965,
"mc2": 0.48593837171548643,
"mc2_stderr": 0.01578462194827542
},
"harness|winogrande|5": {
"acc": 0.5217048145224941,
"acc_stderr": 0.014039239216484627
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
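For a quick consistency check, the per-task numbers above can be re-aggregated by hand. The sketch below is plain Python and assumes the results JSON shown above has been saved locally as `results.json` (a hypothetical filename); it averages `acc` over the 57 MMLU ("hendrycksTest") subtasks:

```python
import json

# Hypothetical local copy of the results JSON printed above.
with open("results.json") as f:
    results = json.load(f)

# Mean accuracy over the MMLU ("hendrycksTest") subtasks.
mmlu = [v["acc"] for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")]
print(f"{len(mmlu)} MMLU subtasks, mean acc = {sum(mmlu) / len(mmlu):.4f}")
```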
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_BFauber__opt125m_10e6_run1 | [
"region:us"
] | 2024-02-02T18:21:24+00:00 | {"pretty_name": "Evaluation run of BFauber/opt125m_10e6_run1", "dataset_summary": "Dataset automatically created during the evaluation run of model [BFauber/opt125m_10e6_run1](https://huggingface.co/BFauber/opt125m_10e6_run1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BFauber__opt125m_10e6_run1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T18:19:34.951673](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt125m_10e6_run1/blob/main/results_2024-02-02T18-19-34.951673.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2453956177453566,\n \"acc_stderr\": 0.03035774790592599,\n \"acc_norm\": 0.24574841257866145,\n \"acc_norm_stderr\": 0.031160600953299776,\n \"mc1\": 0.24724602203182375,\n \"mc1_stderr\": 0.01510240479735965,\n \"mc2\": 0.48593837171548643,\n \"mc2_stderr\": 0.01578462194827542\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.2090443686006826,\n \"acc_stderr\": 0.011882746987406455,\n \"acc_norm\": 0.23976109215017063,\n \"acc_norm_stderr\": 0.012476304127453956\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.27693686516630156,\n \"acc_stderr\": 0.00446570481089354,\n \"acc_norm\": 0.29794861581358295,\n \"acc_norm_stderr\": 0.004564220870531578\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17105263157894737,\n \"acc_stderr\": 0.030643607071677088,\n \"acc_norm\": 0.17105263157894737,\n \"acc_norm_stderr\": 0.030643607071677088\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.23018867924528302,\n \"acc_stderr\": 0.025907897122408173,\n \"acc_norm\": 0.23018867924528302,\n \"acc_norm_stderr\": 0.025907897122408173\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.17,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n \"acc_stderr\": 0.03242414757483099,\n \"acc_norm\": 0.23699421965317918,\n \"acc_norm_stderr\": 0.03242414757483099\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.13725490196078433,\n \"acc_stderr\": 0.034240846698915216,\n \"acc_norm\": 0.13725490196078433,\n \"acc_norm_stderr\": 0.034240846698915216\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2127659574468085,\n \"acc_stderr\": 0.026754391348039776,\n \"acc_norm\": 0.2127659574468085,\n \"acc_norm_stderr\": 0.026754391348039776\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.296551724137931,\n \"acc_stderr\": 0.03806142687309993,\n \"acc_norm\": 0.296551724137931,\n \"acc_norm_stderr\": 0.03806142687309993\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708617,\n \"acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708617\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15079365079365079,\n \"acc_stderr\": 0.03200686497287392,\n \"acc_norm\": 0.15079365079365079,\n \"acc_norm_stderr\": 0.03200686497287392\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653695,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653695\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25483870967741934,\n \"acc_stderr\": 0.024790118459332215,\n \"acc_norm\": 0.25483870967741934,\n \"acc_norm_stderr\": 0.024790118459332215\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.28078817733990147,\n \"acc_stderr\": 0.03161856335358609,\n \"acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.03161856335358609\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2606060606060606,\n \"acc_stderr\": 0.03427743175816524,\n \"acc_norm\": 0.2606060606060606,\n \"acc_norm_stderr\": 0.03427743175816524\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.2474747474747475,\n \"acc_stderr\": 0.030746300742124488,\n \"acc_norm\": 0.2474747474747475,\n \"acc_norm_stderr\": 0.030746300742124488\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.21761658031088082,\n \"acc_stderr\": 0.02977866303775296,\n \"acc_norm\": 0.21761658031088082,\n \"acc_norm_stderr\": 0.02977866303775296\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.22564102564102564,\n \"acc_stderr\": 0.021193632525148547,\n \"acc_norm\": 0.22564102564102564,\n \"acc_norm_stderr\": 0.021193632525148547\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.026265024608275886,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.026265024608275886\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.21834862385321102,\n \"acc_stderr\": 0.017712600528722734,\n \"acc_norm\": 0.21834862385321102,\n \"acc_norm_stderr\": 0.017712600528722734\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1712962962962963,\n \"acc_stderr\": 0.025695341643824685,\n \"acc_norm\": 0.1712962962962963,\n \"acc_norm_stderr\": 0.025695341643824685\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.27941176470588236,\n \"acc_stderr\": 0.031493281045079556,\n \"acc_norm\": 0.27941176470588236,\n \"acc_norm_stderr\": 0.031493281045079556\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2742616033755274,\n \"acc_stderr\": 0.029041333510598025,\n \"acc_norm\": 0.2742616033755274,\n \"acc_norm_stderr\": 0.029041333510598025\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2062780269058296,\n \"acc_stderr\": 0.027157150479563824,\n \"acc_norm\": 0.2062780269058296,\n \"acc_norm_stderr\": 0.027157150479563824\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.183206106870229,\n \"acc_stderr\": 0.03392770926494733,\n \"acc_norm\": 0.183206106870229,\n \"acc_norm_stderr\": 0.03392770926494733\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.35537190082644626,\n \"acc_stderr\": 0.04369236326573981,\n \"acc_norm\": 0.35537190082644626,\n \"acc_norm_stderr\": 0.04369236326573981\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3055555555555556,\n \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04109974682633932,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04109974682633932\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.21359223300970873,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.21359223300970873,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.28205128205128205,\n \"acc_stderr\": 0.029480360549541194,\n \"acc_norm\": 0.28205128205128205,\n \"acc_norm_stderr\": 0.029480360549541194\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.26947637292464877,\n \"acc_stderr\": 0.015866243073215054,\n \"acc_norm\": 0.26947637292464877,\n \"acc_norm_stderr\": 0.015866243073215054\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.28901734104046245,\n \"acc_stderr\": 0.02440517393578323,\n \"acc_norm\": 0.28901734104046245,\n \"acc_norm_stderr\": 0.02440517393578323\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.024848018263875195,\n \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.024848018263875195\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2797427652733119,\n \"acc_stderr\": 0.025494259350694888,\n \"acc_norm\": 0.2797427652733119,\n \"acc_norm_stderr\": 0.025494259350694888\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2191358024691358,\n \"acc_stderr\": 0.02301670564026219,\n \"acc_norm\": 0.2191358024691358,\n \"acc_norm_stderr\": 0.02301670564026219\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2127659574468085,\n \"acc_stderr\": 0.024414612974307713,\n \"acc_norm\": 0.2127659574468085,\n \"acc_norm_stderr\": 0.024414612974307713\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.25358539765319427,\n \"acc_stderr\": 0.011111715336101143,\n \"acc_norm\": 0.25358539765319427,\n \"acc_norm_stderr\": 0.011111715336101143\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.15441176470588236,\n \"acc_stderr\": 0.021950024722922026,\n \"acc_norm\": 0.15441176470588236,\n \"acc_norm_stderr\": 0.021950024722922026\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.18181818181818182,\n \"acc_stderr\": 0.036942843353378,\n \"acc_norm\": 0.18181818181818182,\n \"acc_norm_stderr\": 0.036942843353378\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.24081632653061225,\n \"acc_stderr\": 0.027372942201788163,\n \"acc_norm\": 0.24081632653061225,\n \"acc_norm_stderr\": 0.027372942201788163\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2835820895522388,\n \"acc_stderr\": 0.031871875379197966,\n \"acc_norm\": 0.2835820895522388,\n \"acc_norm_stderr\": 0.031871875379197966\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.1927710843373494,\n \"acc_stderr\": 0.03070982405056527,\n \"acc_norm\": 0.1927710843373494,\n \"acc_norm_stderr\": 0.03070982405056527\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.29239766081871343,\n \"acc_stderr\": 0.034886477134579215,\n \"acc_norm\": 0.29239766081871343,\n \"acc_norm_stderr\": 0.034886477134579215\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24724602203182375,\n \"mc1_stderr\": 0.01510240479735965,\n \"mc2\": 0.48593837171548643,\n \"mc2_stderr\": 0.01578462194827542\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5217048145224941,\n \"acc_stderr\": 0.014039239216484627\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": 
"https://huggingface.co/BFauber/opt125m_10e6_run1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|arc:challenge|25_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|gsm8k|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hellaswag|10_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-19-34.951673.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-19-34.951673.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-19-34.951673.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T18-19-34.951673.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-19-34.951673.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T18_19_34.951673", "path": ["**/details_harness|winogrande|5_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T18-19-34.951673.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T18_19_34.951673", "path": ["results_2024-02-02T18-19-34.951673.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T18-19-34.951673.parquet"]}]}]} | 2024-02-02T18:21:48+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of BFauber/opt125m_10e6_run1
Dataset automatically created during the evaluation run of model BFauber/opt125m_10e6_run1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-02-02T18:19:34.951673 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of BFauber/opt125m_10e6_run1\n\n\n\nDataset automatically created during the evaluation run of model BFauber/opt125m_10e6_run1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T18:19:34.951673(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of BFauber/opt125m_10e6_run1\n\n\n\nDataset automatically created during the evaluation run of model BFauber/opt125m_10e6_run1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T18:19:34.951673(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
a1f9bbbf4a2889ff50f0e1f9056bce0a03b4a5ef |
# Dataset Card for Evaluation run of BFauber/bloom-1b1_10e6
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BFauber/bloom-1b1_10e6](https://huggingface.co/BFauber/bloom-1b1_10e6) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BFauber__bloom-1b1_10e6",
"harness_winogrande_5",
split="train")
```
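
If you only need the aggregated metrics rather than the per-task details, the `results` configuration can be loaded the same way. This is a minimal sketch, assuming the standard `datasets` API and the configuration names declared in this card's metadata:

```python
from datasets import load_dataset

# Load the aggregated "results" configuration; the "latest" split always
# points at the most recent evaluation run for this model.
results = load_dataset(
    "open-llm-leaderboard/details_BFauber__bloom-1b1_10e6",
    "results",
    split="latest",
)

# Each row holds the aggregated metrics of one run; inspect the first one.
print(results[0])
```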
## Latest results
These are the [latest results from run 2024-02-02T18:24:13.797683](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__bloom-1b1_10e6/blob/main/results_2024-02-02T18-24-13.797683.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2552182830483191,
"acc_stderr": 0.030785132717145614,
"acc_norm": 0.2563196958167794,
"acc_norm_stderr": 0.03160376478957473,
"mc1": 0.23990208078335373,
"mc1_stderr": 0.014948812679062133,
"mc2": 0.4440025990339484,
"mc2_stderr": 0.015462178793637952
},
"harness|arc:challenge|25": {
"acc": 0.2235494880546075,
"acc_stderr": 0.012174896631202607,
"acc_norm": 0.25426621160409557,
"acc_norm_stderr": 0.01272499994515773
},
"harness|hellaswag|10": {
"acc": 0.31228838876717785,
"acc_stderr": 0.004624796348128794,
"acc_norm": 0.3712407886875124,
"acc_norm_stderr": 0.0048214929940821345
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.28289473684210525,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.28289473684210525,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2188679245283019,
"acc_stderr": 0.02544786382510861,
"acc_norm": 0.2188679245283019,
"acc_norm_stderr": 0.02544786382510861
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.17,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.17,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.20425531914893616,
"acc_stderr": 0.026355158413349424,
"acc_norm": 0.20425531914893616,
"acc_norm_stderr": 0.026355158413349424
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748141,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748141
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.296551724137931,
"acc_stderr": 0.03806142687309993,
"acc_norm": 0.296551724137931,
"acc_norm_stderr": 0.03806142687309993
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.022789673145776564,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.022789673145776564
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15079365079365079,
"acc_stderr": 0.03200686497287392,
"acc_norm": 0.15079365079365079,
"acc_norm_stderr": 0.03200686497287392
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25161290322580643,
"acc_stderr": 0.024685979286239956,
"acc_norm": 0.25161290322580643,
"acc_norm_stderr": 0.024685979286239956
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.03546563019624336,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.03546563019624336
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.25252525252525254,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.25252525252525254,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22797927461139897,
"acc_stderr": 0.030276909945178256,
"acc_norm": 0.22797927461139897,
"acc_norm_stderr": 0.030276909945178256
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2128205128205128,
"acc_stderr": 0.020752423722128013,
"acc_norm": 0.2128205128205128,
"acc_norm_stderr": 0.020752423722128013
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.026265024608275886,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.026265024608275886
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22201834862385322,
"acc_stderr": 0.01781884956479663,
"acc_norm": 0.22201834862385322,
"acc_norm_stderr": 0.01781884956479663
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3472222222222222,
"acc_stderr": 0.032468872436376486,
"acc_norm": 0.3472222222222222,
"acc_norm_stderr": 0.032468872436376486
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2742616033755274,
"acc_stderr": 0.029041333510598018,
"acc_norm": 0.2742616033755274,
"acc_norm_stderr": 0.029041333510598018
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.20179372197309417,
"acc_stderr": 0.026936111912802273,
"acc_norm": 0.20179372197309417,
"acc_norm_stderr": 0.026936111912802273
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.371900826446281,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.371900826446281,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.24107142857142858,
"acc_stderr": 0.04059867246952687,
"acc_norm": 0.24107142857142858,
"acc_norm_stderr": 0.04059867246952687
},
"harness|hendrycksTest-management|5": {
"acc": 0.1941747572815534,
"acc_stderr": 0.039166677628225836,
"acc_norm": 0.1941747572815534,
"acc_norm_stderr": 0.039166677628225836
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.25213675213675213,
"acc_stderr": 0.02844796547623102,
"acc_norm": 0.25213675213675213,
"acc_norm_stderr": 0.02844796547623102
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2707535121328225,
"acc_stderr": 0.015889888362560486,
"acc_norm": 0.2707535121328225,
"acc_norm_stderr": 0.015889888362560486
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.29190751445086704,
"acc_stderr": 0.02447699407624734,
"acc_norm": 0.29190751445086704,
"acc_norm_stderr": 0.02447699407624734
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2990353697749196,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.2990353697749196,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2932098765432099,
"acc_stderr": 0.02532988817190092,
"acc_norm": 0.2932098765432099,
"acc_norm_stderr": 0.02532988817190092
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.26595744680851063,
"acc_stderr": 0.026358065698880592,
"acc_norm": 0.26595744680851063,
"acc_norm_stderr": 0.026358065698880592
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.26727509778357234,
"acc_stderr": 0.011302607515637506,
"acc_norm": 0.26727509778357234,
"acc_norm_stderr": 0.011302607515637506
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.16544117647058823,
"acc_stderr": 0.022571771025494767,
"acc_norm": 0.16544117647058823,
"acc_norm_stderr": 0.022571771025494767
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2826797385620915,
"acc_stderr": 0.01821726955205343,
"acc_norm": 0.2826797385620915,
"acc_norm_stderr": 0.01821726955205343
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2,
"acc_stderr": 0.03831305140884603,
"acc_norm": 0.2,
"acc_norm_stderr": 0.03831305140884603
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24081632653061225,
"acc_stderr": 0.027372942201788163,
"acc_norm": 0.24081632653061225,
"acc_norm_stderr": 0.027372942201788163
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916707,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.20481927710843373,
"acc_stderr": 0.03141784291663926,
"acc_norm": 0.20481927710843373,
"acc_norm_stderr": 0.03141784291663926
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.29239766081871343,
"acc_stderr": 0.034886477134579215,
"acc_norm": 0.29239766081871343,
"acc_norm_stderr": 0.034886477134579215
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23990208078335373,
"mc1_stderr": 0.014948812679062133,
"mc2": 0.4440025990339484,
"mc2_stderr": 0.015462178793637952
},
"harness|winogrande|5": {
"acc": 0.5351223362273086,
"acc_stderr": 0.014017773120881587
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
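
To slice these numbers programmatically, you can download the JSON file linked above and walk its task entries. The snippet below is a sketch, not part of the official tooling; it assumes the file contains the mapping shown above, either at the top level or nested under a `results` key:

```python
import json

# Sketch for extracting per-task accuracy from the results file linked above.
with open("results_2024-02-02T18-24-13.797683.json") as f:
    data = json.load(f)

# Handle either layout: metrics at the top level or under a "results" key.
metrics = data.get("results", data)

print("overall acc:", metrics["all"]["acc"])
for task, scores in sorted(metrics.items()):
    # Skip the overall summary and tasks without an "acc" metric
    # (e.g. truthfulqa:mc only reports mc1/mc2).
    if task == "all" or "acc" not in scores:
        continue
    print(f"{task}: acc={scores['acc']:.3f} (stderr={scores['acc_stderr']:.3f})")
```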
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_BFauber__bloom-1b1_10e6 | [
"region:us"
] | 2024-02-02T18:25:06+00:00 | {"pretty_name": "Evaluation run of BFauber/bloom-1b1_10e6", "dataset_summary": "Dataset automatically created during the evaluation run of model [BFauber/bloom-1b1_10e6](https://huggingface.co/BFauber/bloom-1b1_10e6) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BFauber__bloom-1b1_10e6\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T18:24:13.797683](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__bloom-1b1_10e6/blob/main/results_2024-02-02T18-24-13.797683.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2552182830483191,\n \"acc_stderr\": 0.030785132717145614,\n \"acc_norm\": 0.2563196958167794,\n \"acc_norm_stderr\": 0.03160376478957473,\n \"mc1\": 0.23990208078335373,\n \"mc1_stderr\": 0.014948812679062133,\n \"mc2\": 0.4440025990339484,\n \"mc2_stderr\": 0.015462178793637952\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.2235494880546075,\n \"acc_stderr\": 0.012174896631202607,\n \"acc_norm\": 0.25426621160409557,\n \"acc_norm_stderr\": 0.01272499994515773\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.31228838876717785,\n \"acc_stderr\": 0.004624796348128794,\n \"acc_norm\": 0.3712407886875124,\n \"acc_norm_stderr\": 0.0048214929940821345\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.28289473684210525,\n \"acc_stderr\": 0.03665349695640767,\n \"acc_norm\": 0.28289473684210525,\n \"acc_norm_stderr\": 0.03665349695640767\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2188679245283019,\n \"acc_stderr\": 0.02544786382510861,\n \"acc_norm\": 0.2188679245283019,\n \"acc_norm_stderr\": 0.02544786382510861\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.17,\n 
\"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2543352601156069,\n \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.2543352601156069,\n \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.20425531914893616,\n \"acc_stderr\": 0.026355158413349424,\n \"acc_norm\": 0.20425531914893616,\n \"acc_norm_stderr\": 0.026355158413349424\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.296551724137931,\n \"acc_stderr\": 0.03806142687309993,\n \"acc_norm\": 0.296551724137931,\n \"acc_norm_stderr\": 0.03806142687309993\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2671957671957672,\n \"acc_stderr\": 0.022789673145776564,\n \"acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.022789673145776564\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15079365079365079,\n \"acc_stderr\": 0.03200686497287392,\n \"acc_norm\": 0.15079365079365079,\n \"acc_norm_stderr\": 0.03200686497287392\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25161290322580643,\n \"acc_stderr\": 0.024685979286239956,\n \"acc_norm\": 0.25161290322580643,\n \"acc_norm_stderr\": 0.024685979286239956\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2909090909090909,\n \"acc_stderr\": 0.03546563019624336,\n \"acc_norm\": 0.2909090909090909,\n \"acc_norm_stderr\": 0.03546563019624336\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.25252525252525254,\n \"acc_stderr\": 0.030954055470365897,\n \"acc_norm\": 0.25252525252525254,\n \"acc_norm_stderr\": 0.030954055470365897\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.22797927461139897,\n \"acc_stderr\": 0.030276909945178256,\n \"acc_norm\": 0.22797927461139897,\n \"acc_norm_stderr\": 0.030276909945178256\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2128205128205128,\n \"acc_stderr\": 0.020752423722128013,\n \"acc_norm\": 0.2128205128205128,\n \"acc_norm_stderr\": 0.020752423722128013\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.026265024608275886,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.026265024608275886\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.22201834862385322,\n \"acc_stderr\": 0.01781884956479663,\n \"acc_norm\": 0.22201834862385322,\n \"acc_norm_stderr\": 0.01781884956479663\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3472222222222222,\n \"acc_stderr\": 0.032468872436376486,\n \"acc_norm\": 0.3472222222222222,\n \"acc_norm_stderr\": 0.032468872436376486\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.24019607843137256,\n \"acc_stderr\": 0.02998373305591361,\n \"acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.02998373305591361\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2742616033755274,\n \"acc_stderr\": 0.029041333510598018,\n \"acc_norm\": 0.2742616033755274,\n \"acc_norm_stderr\": 0.029041333510598018\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.20179372197309417,\n \"acc_stderr\": 0.026936111912802273,\n \"acc_norm\": 0.20179372197309417,\n \"acc_norm_stderr\": 0.026936111912802273\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.371900826446281,\n \"acc_stderr\": 0.044120158066245044,\n \"acc_norm\": 0.371900826446281,\n \"acc_norm_stderr\": 0.044120158066245044\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.23148148148148148,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n \"acc_stderr\": 0.04059867246952687,\n \"acc_norm\": 0.24107142857142858,\n \"acc_norm_stderr\": 0.04059867246952687\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.1941747572815534,\n \"acc_stderr\": 0.039166677628225836,\n \"acc_norm\": 0.1941747572815534,\n \"acc_norm_stderr\": 0.039166677628225836\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.25213675213675213,\n \"acc_stderr\": 0.02844796547623102,\n \"acc_norm\": 0.25213675213675213,\n \"acc_norm_stderr\": 0.02844796547623102\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-miscellaneous|5\": 
{\n \"acc\": 0.2707535121328225,\n \"acc_stderr\": 0.015889888362560486,\n \"acc_norm\": 0.2707535121328225,\n \"acc_norm_stderr\": 0.015889888362560486\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.29190751445086704,\n \"acc_stderr\": 0.02447699407624734,\n \"acc_norm\": 0.29190751445086704,\n \"acc_norm_stderr\": 0.02447699407624734\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.024848018263875195,\n \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.024848018263875195\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2990353697749196,\n \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.2990353697749196,\n \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2932098765432099,\n \"acc_stderr\": 0.02532988817190092,\n \"acc_norm\": 0.2932098765432099,\n \"acc_norm_stderr\": 0.02532988817190092\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.26595744680851063,\n \"acc_stderr\": 0.026358065698880592,\n \"acc_norm\": 0.26595744680851063,\n \"acc_norm_stderr\": 0.026358065698880592\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.26727509778357234,\n \"acc_stderr\": 0.011302607515637506,\n \"acc_norm\": 0.26727509778357234,\n \"acc_norm_stderr\": 0.011302607515637506\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.16544117647058823,\n \"acc_stderr\": 0.022571771025494767,\n \"acc_norm\": 0.16544117647058823,\n \"acc_norm_stderr\": 0.022571771025494767\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2826797385620915,\n \"acc_stderr\": 0.01821726955205343,\n \"acc_norm\": 0.2826797385620915,\n \"acc_norm_stderr\": 0.01821726955205343\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.03831305140884603,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.03831305140884603\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.24081632653061225,\n \"acc_stderr\": 0.027372942201788163,\n \"acc_norm\": 0.24081632653061225,\n \"acc_norm_stderr\": 0.027372942201788163\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n \"acc_stderr\": 0.030567675938916707,\n \"acc_norm\": 0.24875621890547264,\n \"acc_norm_stderr\": 0.030567675938916707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.20481927710843373,\n \"acc_stderr\": 0.03141784291663926,\n \"acc_norm\": 0.20481927710843373,\n \"acc_norm_stderr\": 0.03141784291663926\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.29239766081871343,\n \"acc_stderr\": 0.034886477134579215,\n \"acc_norm\": 0.29239766081871343,\n \"acc_norm_stderr\": 0.034886477134579215\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23990208078335373,\n \"mc1_stderr\": 0.014948812679062133,\n \"mc2\": 0.4440025990339484,\n \"mc2_stderr\": 0.015462178793637952\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5351223362273086,\n \"acc_stderr\": 0.014017773120881587\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": 
"https://huggingface.co/BFauber/bloom-1b1_10e6", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|arc:challenge|25_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|gsm8k|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hellaswag|10_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-24-13.797683.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-24-13.797683.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-24-13.797683.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T18-24-13.797683.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-24-13.797683.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T18_24_13.797683", "path": ["**/details_harness|winogrande|5_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T18-24-13.797683.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T18_24_13.797683", "path": ["results_2024-02-02T18-24-13.797683.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T18-24-13.797683.parquet"]}]}]} | 2024-02-02T18:25:29+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of BFauber/bloom-1b1_10e6
Dataset automatically created during the evaluation run of model BFauber/bloom-1b1_10e6 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
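A minimal sketch of that call, mirroring the loading pattern used by the other cards in this document (the repo id below is inferred from this card's naming convention, so treat it as an assumption):

```python
from datasets import load_dataset

# Repo id assumed from the Open LLM Leaderboard "details_<org>__<model>"
# naming convention; "harness_winogrande_5" is one of this card's configs.
data = load_dataset("open-llm-leaderboard/details_BFauber__bloom-1b1_10e6",
    "harness_winogrande_5",
    split="train")
```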
## Latest results
These are the latest results from run 2024-02-02T18:24:13.797683 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of BFauber/bloom-1b1_10e6\n\n\n\nDataset automatically created during the evaluation run of model BFauber/bloom-1b1_10e6 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T18:24:13.797683(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of BFauber/bloom-1b1_10e6\n\n\n\nDataset automatically created during the evaluation run of model BFauber/bloom-1b1_10e6 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T18:24:13.797683(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
910ae68e43ddb440593d8c2d3c98058addd6d189 |
This dataset extracts all conversations containing NSFW content from `grimulkan/bluemoon_Karen_cleaned`; only the first round of each conversation is retained.
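A minimal usage sketch (the `train` split name is an assumption; `long_response` is the flag explained below):

```python
from datasets import load_dataset

# Load the dataset and keep only rows whose chosen reply is longer than
# the prompt, per the long_response flag described below. The "train"
# split name is an assumption.
data = load_dataset("tastypear/bluemoon-cleaned-lewd", split="train")
long_rows = data.filter(lambda row: row["long_response"] == 1)
```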
Explanation of long_response:
```python
# Set when the chosen reply is longer than the prompt
if len(chosen) > len(prompt):
    long_response = 1
``` | tastypear/bluemoon-cleaned-lewd | [
"task_categories:text-generation",
"language:en",
"license:apache-2.0",
"not-for-all-audiences",
"region:us"
] | 2024-02-02T18:31:56+00:00 | {"language": ["en"], "license": "apache-2.0", "task_categories": ["text-generation"], "tags": ["not-for-all-audiences"]} | 2024-02-02T18:41:01+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-generation #language-English #license-apache-2.0 #not-for-all-audiences #region-us
|
This dataset extracts all conversations containing NSFW content from 'grimulkan/bluemoon_Karen_cleaned'; only the first round of each conversation is retained.
Explanation of long_response:
| [] | [
"TAGS\n#task_categories-text-generation #language-English #license-apache-2.0 #not-for-all-audiences #region-us \n"
] |
7d1e948a3ac43269f7670c8ef68ccb9db7d05348 |
# Dataset Card for Evaluation run of BFauber/opt125m_10e4
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BFauber/opt125m_10e4](https://huggingface.co/BFauber/opt125m_10e4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BFauber__opt125m_10e4",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-02T18:39:22.964015](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt125m_10e4/blob/main/results_2024-02-02T18-39-22.964015.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2653997971375558,
"acc_stderr": 0.03091751185138889,
"acc_norm": 0.2667186188669961,
"acc_norm_stderr": 0.03173676795406758,
"mc1": 0.22888616891064872,
"mc1_stderr": 0.014706994909055027,
"mc2": 0.4287951061339734,
"mc2_stderr": 0.014935297274427089
},
"harness|arc:challenge|25": {
"acc": 0.20648464163822525,
"acc_stderr": 0.011828865619002316,
"acc_norm": 0.2295221843003413,
"acc_norm_stderr": 0.012288926760890797
},
"harness|hellaswag|10": {
"acc": 0.2877912766381199,
"acc_stderr": 0.004518080594528024,
"acc_norm": 0.3090021907986457,
"acc_norm_stderr": 0.004611377019520813
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.03673731683969506,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.03673731683969506
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2981132075471698,
"acc_stderr": 0.028152837942493857,
"acc_norm": 0.2981132075471698,
"acc_norm_stderr": 0.028152837942493857
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.31213872832369943,
"acc_stderr": 0.035331333893236574,
"acc_norm": 0.31213872832369943,
"acc_norm_stderr": 0.035331333893236574
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082633,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082633
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.0285048564705142,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.0285048564705142
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.042663394431593935,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.042663394431593935
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.03600105692727772,
"acc_norm": 0.2482758620689655,
"acc_norm_stderr": 0.03600105692727772
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.022182037202948368,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.022182037202948368
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604675,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604675
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3161290322580645,
"acc_stderr": 0.02645087448904277,
"acc_norm": 0.3161290322580645,
"acc_norm_stderr": 0.02645087448904277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03010833071801162,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03010833071801162
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3484848484848485,
"acc_stderr": 0.033948539651564025,
"acc_norm": 0.3484848484848485,
"acc_norm_stderr": 0.033948539651564025
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.36787564766839376,
"acc_stderr": 0.03480175668466036,
"acc_norm": 0.36787564766839376,
"acc_norm_stderr": 0.03480175668466036
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3641025641025641,
"acc_stderr": 0.02439667298509477,
"acc_norm": 0.3641025641025641,
"acc_norm_stderr": 0.02439667298509477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3445378151260504,
"acc_stderr": 0.030868682604121633,
"acc_norm": 0.3445378151260504,
"acc_norm_stderr": 0.030868682604121633
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658754,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658754
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3376146788990826,
"acc_stderr": 0.020275265986638907,
"acc_norm": 0.3376146788990826,
"acc_norm_stderr": 0.020275265986638907
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.1940928270042194,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.1940928270042194,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.1210762331838565,
"acc_stderr": 0.021894174113185737,
"acc_norm": 0.1210762331838565,
"acc_norm_stderr": 0.021894174113185737
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2900763358778626,
"acc_stderr": 0.03980066246467765,
"acc_norm": 0.2900763358778626,
"acc_norm_stderr": 0.03980066246467765
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.14049586776859505,
"acc_stderr": 0.03172233426002161,
"acc_norm": 0.14049586776859505,
"acc_norm_stderr": 0.03172233426002161
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2331288343558282,
"acc_stderr": 0.033220157957767414,
"acc_norm": 0.2331288343558282,
"acc_norm_stderr": 0.033220157957767414
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.15178571428571427,
"acc_stderr": 0.03405702838185694,
"acc_norm": 0.15178571428571427,
"acc_norm_stderr": 0.03405702838185694
},
"harness|hendrycksTest-management|5": {
"acc": 0.34951456310679613,
"acc_stderr": 0.04721188506097173,
"acc_norm": 0.34951456310679613,
"acc_norm_stderr": 0.04721188506097173
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19658119658119658,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.19658119658119658,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.20306513409961685,
"acc_stderr": 0.014385525076611581,
"acc_norm": 0.20306513409961685,
"acc_norm_stderr": 0.014385525076611581
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.022075709251757183,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.022075709251757183
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249588,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.014893391735249588
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.02609016250427905,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.02609016250427905
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24115755627009647,
"acc_stderr": 0.024296594034763426,
"acc_norm": 0.24115755627009647,
"acc_norm_stderr": 0.024296594034763426
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22530864197530864,
"acc_stderr": 0.023246202647819743,
"acc_norm": 0.22530864197530864,
"acc_norm_stderr": 0.023246202647819743
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2624113475177305,
"acc_stderr": 0.026244920349843007,
"acc_norm": 0.2624113475177305,
"acc_norm_stderr": 0.026244920349843007
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2173202614379085,
"acc_stderr": 0.01668482092914859,
"acc_norm": 0.2173202614379085,
"acc_norm_stderr": 0.01668482092914859
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072774,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072774
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4,
"acc_stderr": 0.031362502409358936,
"acc_norm": 0.4,
"acc_norm_stderr": 0.031362502409358936
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.25870646766169153,
"acc_stderr": 0.030965903123573026,
"acc_norm": 0.25870646766169153,
"acc_norm_stderr": 0.030965903123573026
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-virology|5": {
"acc": 0.20481927710843373,
"acc_stderr": 0.03141784291663926,
"acc_norm": 0.20481927710843373,
"acc_norm_stderr": 0.03141784291663926
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.17543859649122806,
"acc_stderr": 0.029170885500727654,
"acc_norm": 0.17543859649122806,
"acc_norm_stderr": 0.029170885500727654
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22888616891064872,
"mc1_stderr": 0.014706994909055027,
"mc2": 0.4287951061339734,
"mc2_stderr": 0.014935297274427089
},
"harness|winogrande|5": {
"acc": 0.4972375690607735,
"acc_stderr": 0.014052271211616448
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_BFauber__opt125m_10e4 | [
"region:us"
] | 2024-02-02T18:41:09+00:00 | {"pretty_name": "Evaluation run of BFauber/opt125m_10e4", "dataset_summary": "Dataset automatically created during the evaluation run of model [BFauber/opt125m_10e4](https://huggingface.co/BFauber/opt125m_10e4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BFauber__opt125m_10e4\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T18:39:22.964015](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt125m_10e4/blob/main/results_2024-02-02T18-39-22.964015.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2653997971375558,\n \"acc_stderr\": 0.03091751185138889,\n \"acc_norm\": 0.2667186188669961,\n \"acc_norm_stderr\": 0.03173676795406758,\n \"mc1\": 0.22888616891064872,\n \"mc1_stderr\": 0.014706994909055027,\n \"mc2\": 0.4287951061339734,\n \"mc2_stderr\": 0.014935297274427089\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.20648464163822525,\n \"acc_stderr\": 0.011828865619002316,\n \"acc_norm\": 0.2295221843003413,\n \"acc_norm_stderr\": 0.012288926760890797\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2877912766381199,\n \"acc_stderr\": 0.004518080594528024,\n \"acc_norm\": 0.3090021907986457,\n \"acc_norm_stderr\": 0.004611377019520813\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.23703703703703705,\n \"acc_stderr\": 0.03673731683969506,\n \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.03673731683969506\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.03690677986137283,\n \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.03690677986137283\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2981132075471698,\n \"acc_stderr\": 0.028152837942493857,\n \"acc_norm\": 0.2981132075471698,\n \"acc_norm_stderr\": 0.028152837942493857\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2361111111111111,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.2361111111111111,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.34,\n 
\"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.31213872832369943,\n \"acc_stderr\": 0.035331333893236574,\n \"acc_norm\": 0.31213872832369943,\n \"acc_norm_stderr\": 0.035331333893236574\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082633,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082633\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2553191489361702,\n \"acc_stderr\": 0.0285048564705142,\n \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.0285048564705142\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.042663394431593935,\n \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.042663394431593935\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.03600105692727772,\n \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.03600105692727772\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.24603174603174602,\n \"acc_stderr\": 0.022182037202948368,\n \"acc_norm\": 0.24603174603174602,\n \"acc_norm_stderr\": 0.022182037202948368\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n \"acc_stderr\": 0.03893259610604675,\n \"acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.03893259610604675\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3161290322580645,\n \"acc_stderr\": 0.02645087448904277,\n \"acc_norm\": 0.3161290322580645,\n \"acc_norm_stderr\": 0.02645087448904277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03010833071801162,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03010833071801162\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.03401506715249039,\n \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.03401506715249039\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.3484848484848485,\n \"acc_stderr\": 0.033948539651564025,\n \"acc_norm\": 0.3484848484848485,\n \"acc_norm_stderr\": 0.033948539651564025\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.36787564766839376,\n \"acc_stderr\": 0.03480175668466036,\n \"acc_norm\": 0.36787564766839376,\n \"acc_norm_stderr\": 0.03480175668466036\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.3641025641025641,\n \"acc_stderr\": 0.02439667298509477,\n \"acc_norm\": 0.3641025641025641,\n \"acc_norm_stderr\": 0.02439667298509477\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3445378151260504,\n \"acc_stderr\": 0.030868682604121633,\n \"acc_norm\": 0.3445378151260504,\n \"acc_norm_stderr\": 0.030868682604121633\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658754,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658754\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3376146788990826,\n \"acc_stderr\": 0.020275265986638907,\n \"acc_norm\": 0.3376146788990826,\n \"acc_norm_stderr\": 0.020275265986638907\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604246,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604246\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.1940928270042194,\n \"acc_stderr\": 0.025744902532290916,\n \"acc_norm\": 0.1940928270042194,\n \"acc_norm_stderr\": 0.025744902532290916\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.1210762331838565,\n \"acc_stderr\": 0.021894174113185737,\n \"acc_norm\": 0.1210762331838565,\n \"acc_norm_stderr\": 0.021894174113185737\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2900763358778626,\n \"acc_stderr\": 0.03980066246467765,\n \"acc_norm\": 0.2900763358778626,\n \"acc_norm_stderr\": 0.03980066246467765\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.14049586776859505,\n \"acc_stderr\": 0.03172233426002161,\n \"acc_norm\": 0.14049586776859505,\n \"acc_norm_stderr\": 0.03172233426002161\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.21296296296296297,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.033220157957767414,\n \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.033220157957767414\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.15178571428571427,\n \"acc_stderr\": 0.03405702838185694,\n \"acc_norm\": 0.15178571428571427,\n \"acc_norm_stderr\": 0.03405702838185694\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.34951456310679613,\n \"acc_stderr\": 0.04721188506097173,\n \"acc_norm\": 0.34951456310679613,\n \"acc_norm_stderr\": 0.04721188506097173\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19658119658119658,\n \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.19658119658119658,\n \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.20306513409961685,\n \"acc_stderr\": 0.014385525076611581,\n \"acc_norm\": 0.20306513409961685,\n \"acc_norm_stderr\": 0.014385525076611581\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2138728323699422,\n \"acc_stderr\": 0.022075709251757183,\n \"acc_norm\": 0.2138728323699422,\n \"acc_norm_stderr\": 0.022075709251757183\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n \"acc_stderr\": 0.014893391735249588,\n \"acc_norm\": 0.27262569832402234,\n \"acc_norm_stderr\": 0.014893391735249588\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.02609016250427905,\n \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.02609016250427905\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24115755627009647,\n \"acc_stderr\": 0.024296594034763426,\n \"acc_norm\": 0.24115755627009647,\n \"acc_norm_stderr\": 0.024296594034763426\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.22530864197530864,\n \"acc_stderr\": 0.023246202647819743,\n \"acc_norm\": 0.22530864197530864,\n \"acc_norm_stderr\": 0.023246202647819743\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2624113475177305,\n \"acc_stderr\": 0.026244920349843007,\n \"acc_norm\": 0.2624113475177305,\n \"acc_norm_stderr\": 0.026244920349843007\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2173202614379085,\n \"acc_stderr\": 0.01668482092914859,\n \"acc_norm\": 0.2173202614379085,\n \"acc_norm_stderr\": 0.01668482092914859\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n \"acc_stderr\": 0.04013964554072774,\n \"acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.04013964554072774\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.031362502409358936,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.031362502409358936\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.25870646766169153,\n \"acc_stderr\": 0.030965903123573026,\n \"acc_norm\": 0.25870646766169153,\n \"acc_norm_stderr\": 0.030965903123573026\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.20481927710843373,\n \"acc_stderr\": 0.03141784291663926,\n \"acc_norm\": 0.20481927710843373,\n \"acc_norm_stderr\": 0.03141784291663926\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.17543859649122806,\n \"acc_stderr\": 0.029170885500727654,\n \"acc_norm\": 0.17543859649122806,\n \"acc_norm_stderr\": 0.029170885500727654\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22888616891064872,\n \"mc1_stderr\": 0.014706994909055027,\n \"mc2\": 0.4287951061339734,\n \"mc2_stderr\": 0.014935297274427089\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.4972375690607735,\n \"acc_stderr\": 0.014052271211616448\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": 
"https://huggingface.co/BFauber/opt125m_10e4", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|arc:challenge|25_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|gsm8k|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hellaswag|10_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-39-22.964015.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-39-22.964015.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-39-22.964015.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T18-39-22.964015.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-39-22.964015.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T18_39_22.964015", "path": ["**/details_harness|winogrande|5_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T18-39-22.964015.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T18_39_22.964015", "path": ["results_2024-02-02T18-39-22.964015.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T18-39-22.964015.parquet"]}]}]} | 2024-02-02T18:41:34+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of BFauber/opt125m_10e4
Dataset automatically created during the evaluation run of model BFauber/opt125m_10e4 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
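A minimal sketch, following the load pattern these leaderboard cards share (the repository id `open-llm-leaderboard/details_BFauber__opt125m_10e4` is inferred from that naming convention):

```python
from datasets import load_dataset

# the "train" split always points at the latest results for this run
data = load_dataset("open-llm-leaderboard/details_BFauber__opt125m_10e4",
	"harness_winogrande_5",
	split="train")
```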
## Latest results
These are the latest results from run 2024-02-02T18:39:22.964015 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of BFauber/opt125m_10e4\n\n\n\nDataset automatically created during the evaluation run of model BFauber/opt125m_10e4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T18:39:22.964015(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of BFauber/opt125m_10e4\n\n\n\nDataset automatically created during the evaluation run of model BFauber/opt125m_10e4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T18:39:22.964015(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
56c32588db3ff3dc5ccadb9e69e6ce4e141c01dc |
# Dataset Card for Evaluation run of BFauber/opt125m_10e3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BFauber/opt125m_10e3](https://huggingface.co/BFauber/opt125m_10e3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BFauber__opt125m_10e3",
"harness_winogrande_5",
split="train")
```
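The aggregated metrics live in the "results" configuration; a minimal sketch, assuming the split names (`latest` plus one timestamped split per run) declared in this repo's config metadata:

```python
from datasets import load_dataset

# aggregated metrics of the most recent evaluation run
results = load_dataset("open-llm-leaderboard/details_BFauber__opt125m_10e3",
	"results",
	split="latest")

# or pin the single run recorded for this model by its timestamped split
pinned = load_dataset("open-llm-leaderboard/details_BFauber__opt125m_10e3",
	"results",
	split="2024_02_02T18_42_28.559578")
```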
## Latest results
These are the [latest results from run 2024-02-02T18:42:28.559578](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt125m_10e3/blob/main/results_2024-02-02T18-42-28.559578.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26568540943150076,
"acc_stderr": 0.03096541406499521,
"acc_norm": 0.26668743906699194,
"acc_norm_stderr": 0.03178688168143971,
"mc1": 0.23255813953488372,
"mc1_stderr": 0.014789157531080512,
"mc2": 0.4251967814899003,
"mc2_stderr": 0.014868030438434513
},
"harness|arc:challenge|25": {
"acc": 0.20477815699658702,
"acc_stderr": 0.011792544338513393,
"acc_norm": 0.22866894197952217,
"acc_norm_stderr": 0.012272853582540792
},
"harness|hellaswag|10": {
"acc": 0.2876916948814977,
"acc_stderr": 0.0045176146477032475,
"acc_norm": 0.31009759012148974,
"acc_norm_stderr": 0.004615880352799732
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.03633384414073461,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.03633384414073461
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.035834961763610625,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.035834961763610625
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2943396226415094,
"acc_stderr": 0.028049186315695248,
"acc_norm": 0.2943396226415094,
"acc_norm_stderr": 0.028049186315695248
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.22916666666666666,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.22916666666666666,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.31213872832369943,
"acc_stderr": 0.035331333893236574,
"acc_norm": 0.31213872832369943,
"acc_norm_stderr": 0.035331333893236574
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082633,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082633
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.35319148936170214,
"acc_stderr": 0.031245325202761926,
"acc_norm": 0.35319148936170214,
"acc_norm_stderr": 0.031245325202761926
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.22758620689655173,
"acc_stderr": 0.03493950380131184,
"acc_norm": 0.22758620689655173,
"acc_norm_stderr": 0.03493950380131184
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.03852273364924314,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.03852273364924314
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3161290322580645,
"acc_stderr": 0.02645087448904277,
"acc_norm": 0.3161290322580645,
"acc_norm_stderr": 0.02645087448904277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.28078817733990147,
"acc_stderr": 0.03161856335358611,
"acc_norm": 0.28078817733990147,
"acc_norm_stderr": 0.03161856335358611
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.03358618145732523,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.03358618145732523
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.36787564766839376,
"acc_stderr": 0.03480175668466036,
"acc_norm": 0.36787564766839376,
"acc_norm_stderr": 0.03480175668466036
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3641025641025641,
"acc_stderr": 0.02439667298509477,
"acc_norm": 0.3641025641025641,
"acc_norm_stderr": 0.02439667298509477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3487394957983193,
"acc_stderr": 0.03095663632856655,
"acc_norm": 0.3487394957983193,
"acc_norm_stderr": 0.03095663632856655
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658754,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658754
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3229357798165138,
"acc_stderr": 0.02004811592341533,
"acc_norm": 0.3229357798165138,
"acc_norm_stderr": 0.02004811592341533
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.03019028245350195,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.03019028245350195
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.23628691983122363,
"acc_stderr": 0.02765215314415927,
"acc_norm": 0.23628691983122363,
"acc_norm_stderr": 0.02765215314415927
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.10762331838565023,
"acc_stderr": 0.020799400082880008,
"acc_norm": 0.10762331838565023,
"acc_norm_stderr": 0.020799400082880008
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2824427480916031,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.2824427480916031,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2809917355371901,
"acc_stderr": 0.041032038305145124,
"acc_norm": 0.2809917355371901,
"acc_norm_stderr": 0.041032038305145124
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2331288343558282,
"acc_stderr": 0.033220157957767414,
"acc_norm": 0.2331288343558282,
"acc_norm_stderr": 0.033220157957767414
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.17857142857142858,
"acc_stderr": 0.036352091215778065,
"acc_norm": 0.17857142857142858,
"acc_norm_stderr": 0.036352091215778065
},
"harness|hendrycksTest-management|5": {
"acc": 0.3786407766990291,
"acc_stderr": 0.04802694698258972,
"acc_norm": 0.3786407766990291,
"acc_norm_stderr": 0.04802694698258972
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19658119658119658,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.19658119658119658,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.20561941251596424,
"acc_stderr": 0.014452500456785825,
"acc_norm": 0.20561941251596424,
"acc_norm_stderr": 0.014452500456785825
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.022075709251757183,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.022075709251757183
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2737430167597765,
"acc_stderr": 0.014912413096372428,
"acc_norm": 0.2737430167597765,
"acc_norm_stderr": 0.014912413096372428
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2908496732026144,
"acc_stderr": 0.026004800363952113,
"acc_norm": 0.2908496732026144,
"acc_norm_stderr": 0.026004800363952113
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24115755627009647,
"acc_stderr": 0.024296594034763426,
"acc_norm": 0.24115755627009647,
"acc_norm_stderr": 0.024296594034763426
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.20987654320987653,
"acc_stderr": 0.022658344085981375,
"acc_norm": 0.20987654320987653,
"acc_norm_stderr": 0.022658344085981375
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2695035460992908,
"acc_stderr": 0.02646903681859062,
"acc_norm": 0.2695035460992908,
"acc_norm_stderr": 0.02646903681859062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.25554106910039115,
"acc_stderr": 0.011139857833598521,
"acc_norm": 0.25554106910039115,
"acc_norm_stderr": 0.011139857833598521
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.016639319350313264,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.016639319350313264
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072774,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072774
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2693877551020408,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.2693877551020408,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409224,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409224
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-virology|5": {
"acc": 0.19879518072289157,
"acc_stderr": 0.03106939026078943,
"acc_norm": 0.19879518072289157,
"acc_norm_stderr": 0.03106939026078943
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.17543859649122806,
"acc_stderr": 0.029170885500727654,
"acc_norm": 0.17543859649122806,
"acc_norm_stderr": 0.029170885500727654
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23255813953488372,
"mc1_stderr": 0.014789157531080512,
"mc2": 0.4251967814899003,
"mc2_stderr": 0.014868030438434513
},
"harness|winogrande|5": {
"acc": 0.5185477505919495,
"acc_stderr": 0.014042813708888378
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_BFauber__opt125m_10e3 | [
"region:us"
] | 2024-02-02T18:44:11+00:00 | {"pretty_name": "Evaluation run of BFauber/opt125m_10e3", "dataset_summary": "Dataset automatically created during the evaluation run of model [BFauber/opt125m_10e3](https://huggingface.co/BFauber/opt125m_10e3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BFauber__opt125m_10e3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T18:42:28.559578](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt125m_10e3/blob/main/results_2024-02-02T18-42-28.559578.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26568540943150076,\n \"acc_stderr\": 0.03096541406499521,\n \"acc_norm\": 0.26668743906699194,\n \"acc_norm_stderr\": 0.03178688168143971,\n \"mc1\": 0.23255813953488372,\n \"mc1_stderr\": 0.014789157531080512,\n \"mc2\": 0.4251967814899003,\n \"mc2_stderr\": 0.014868030438434513\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.20477815699658702,\n \"acc_stderr\": 0.011792544338513393,\n \"acc_norm\": 0.22866894197952217,\n \"acc_norm_stderr\": 0.012272853582540792\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2876916948814977,\n \"acc_stderr\": 0.0045176146477032475,\n \"acc_norm\": 0.31009759012148974,\n \"acc_norm_stderr\": 0.004615880352799732\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n \"acc_stderr\": 0.03633384414073461,\n \"acc_norm\": 0.22962962962962963,\n \"acc_norm_stderr\": 0.03633384414073461\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.035834961763610625,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.035834961763610625\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2943396226415094,\n \"acc_stderr\": 0.028049186315695248,\n \"acc_norm\": 0.2943396226415094,\n \"acc_norm_stderr\": 0.028049186315695248\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.22916666666666666,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.22916666666666666,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.32,\n 
\"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.31213872832369943,\n \"acc_stderr\": 0.035331333893236574,\n \"acc_norm\": 0.31213872832369943,\n \"acc_norm_stderr\": 0.035331333893236574\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082633,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082633\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.35319148936170214,\n \"acc_stderr\": 0.031245325202761926,\n \"acc_norm\": 0.35319148936170214,\n \"acc_norm_stderr\": 0.031245325202761926\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.22758620689655173,\n \"acc_stderr\": 0.03493950380131184,\n \"acc_norm\": 0.22758620689655173,\n \"acc_norm_stderr\": 0.03493950380131184\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n \"acc_stderr\": 0.03852273364924314,\n \"acc_norm\": 0.24603174603174602,\n \"acc_norm_stderr\": 0.03852273364924314\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3161290322580645,\n \"acc_stderr\": 0.02645087448904277,\n \"acc_norm\": 0.3161290322580645,\n \"acc_norm_stderr\": 0.02645087448904277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.28078817733990147,\n \"acc_stderr\": 0.03161856335358611,\n \"acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.03161856335358611\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.03401506715249039,\n \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.03401506715249039\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.03358618145732523,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.03358618145732523\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.36787564766839376,\n \"acc_stderr\": 0.03480175668466036,\n \"acc_norm\": 0.36787564766839376,\n \"acc_norm_stderr\": 0.03480175668466036\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.3641025641025641,\n \"acc_stderr\": 0.02439667298509477,\n \"acc_norm\": 0.3641025641025641,\n \"acc_norm_stderr\": 0.02439667298509477\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3487394957983193,\n \"acc_stderr\": 0.03095663632856655,\n \"acc_norm\": 0.3487394957983193,\n \"acc_norm_stderr\": 0.03095663632856655\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658754,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658754\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3229357798165138,\n \"acc_stderr\": 0.02004811592341533,\n \"acc_norm\": 0.3229357798165138,\n \"acc_norm_stderr\": 0.02004811592341533\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.03019028245350195,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.03019028245350195\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.23628691983122363,\n \"acc_stderr\": 0.02765215314415927,\n \"acc_norm\": 0.23628691983122363,\n \"acc_norm_stderr\": 0.02765215314415927\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.10762331838565023,\n \"acc_stderr\": 0.020799400082880008,\n \"acc_norm\": 0.10762331838565023,\n \"acc_norm_stderr\": 0.020799400082880008\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2824427480916031,\n \"acc_stderr\": 0.03948406125768361,\n \"acc_norm\": 0.2824427480916031,\n \"acc_norm_stderr\": 0.03948406125768361\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2809917355371901,\n \"acc_stderr\": 0.041032038305145124,\n \"acc_norm\": 0.2809917355371901,\n \"acc_norm_stderr\": 0.041032038305145124\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.21296296296296297,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.033220157957767414,\n \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.033220157957767414\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.17857142857142858,\n \"acc_stderr\": 0.036352091215778065,\n \"acc_norm\": 0.17857142857142858,\n \"acc_norm_stderr\": 0.036352091215778065\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.3786407766990291,\n \"acc_stderr\": 0.04802694698258972,\n \"acc_norm\": 0.3786407766990291,\n \"acc_norm_stderr\": 0.04802694698258972\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19658119658119658,\n \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.19658119658119658,\n \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.20561941251596424,\n \"acc_stderr\": 0.014452500456785825,\n \"acc_norm\": 0.20561941251596424,\n \"acc_norm_stderr\": 0.014452500456785825\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2138728323699422,\n \"acc_stderr\": 0.022075709251757183,\n \"acc_norm\": 0.2138728323699422,\n \"acc_norm_stderr\": 0.022075709251757183\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2737430167597765,\n \"acc_stderr\": 0.014912413096372428,\n \"acc_norm\": 0.2737430167597765,\n \"acc_norm_stderr\": 0.014912413096372428\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2908496732026144,\n \"acc_stderr\": 0.026004800363952113,\n \"acc_norm\": 0.2908496732026144,\n \"acc_norm_stderr\": 0.026004800363952113\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24115755627009647,\n \"acc_stderr\": 0.024296594034763426,\n \"acc_norm\": 0.24115755627009647,\n \"acc_norm_stderr\": 0.024296594034763426\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.20987654320987653,\n \"acc_stderr\": 0.022658344085981375,\n \"acc_norm\": 0.20987654320987653,\n \"acc_norm_stderr\": 0.022658344085981375\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2695035460992908,\n \"acc_stderr\": 0.02646903681859062,\n \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.02646903681859062\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.25554106910039115,\n \"acc_stderr\": 0.011139857833598521,\n \"acc_norm\": 0.25554106910039115,\n \"acc_norm_stderr\": 0.011139857833598521\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.016639319350313264,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.016639319350313264\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n \"acc_stderr\": 0.04013964554072774,\n \"acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.04013964554072774\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.2693877551020408,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.2693877551020408,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n \"acc_stderr\": 0.030147775935409224,\n \"acc_norm\": 0.23880597014925373,\n \"acc_norm_stderr\": 0.030147775935409224\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.19879518072289157,\n \"acc_stderr\": 0.03106939026078943,\n \"acc_norm\": 0.19879518072289157,\n \"acc_norm_stderr\": 0.03106939026078943\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.17543859649122806,\n \"acc_stderr\": 0.029170885500727654,\n \"acc_norm\": 0.17543859649122806,\n \"acc_norm_stderr\": 0.029170885500727654\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23255813953488372,\n \"mc1_stderr\": 0.014789157531080512,\n \"mc2\": 0.4251967814899003,\n \"mc2_stderr\": 0.014868030438434513\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5185477505919495,\n \"acc_stderr\": 0.014042813708888378\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n 
}\n}\n```", "repo_url": "https://huggingface.co/BFauber/opt125m_10e3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|arc:challenge|25_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|gsm8k|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hellaswag|10_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-42-28.559578.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-42-28.559578.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-42-28.559578.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T18-42-28.559578.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-42-28.559578.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T18_42_28.559578", "path": ["**/details_harness|winogrande|5_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T18-42-28.559578.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T18_42_28.559578", "path": ["results_2024-02-02T18-42-28.559578.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T18-42-28.559578.parquet"]}]}]} | 2024-02-02T18:44:36+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of BFauber/opt125m_10e3
Dataset automatically created during the evaluation run of model BFauber/opt125m_10e3 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
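A minimal sketch of that call, mirroring the loader snippet these leaderboard cards ship with (assuming the usual `details_<org>__<model>` dataset naming for this repo; any of the 63 config names can stand in for `harness_winogrande_5`):

```python
from datasets import load_dataset

# One config per evaluated task; the "train" split always points at the latest run
data = load_dataset("open-llm-leaderboard/details_BFauber__opt125m_10e3",
                    "harness_winogrande_5",
                    split="train")
```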
## Latest results
These are the latest results from run 2024-02-02T18:42:28.559578 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of BFauber/opt125m_10e3\n\n\n\nDataset automatically created during the evaluation run of model BFauber/opt125m_10e3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T18:42:28.559578(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of BFauber/opt125m_10e3\n\n\n\nDataset automatically created during the evaluation run of model BFauber/opt125m_10e3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T18:42:28.559578(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
8c2b1b56e706346563d4e96b9628c4789ac0d313 | Dataset built with the bert-cased tokenizer, with sentences cut off at a length of 128 (single sentences, not sentence pairs); all sentence pairs were extracted.
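A small, hedged sketch for peeking at a few rows without downloading the full ~13 GB; it assumes the single `text` column and the lone `train` split declared in this card's metadata:

```python
from itertools import islice
from datasets import load_dataset

# Stream the train split so nothing beyond the first few shards is fetched
ds = load_dataset("gmongaras/BERT_Base_Cased_128_Dataset", split="train", streaming=True)
for row in islice(ds, 3):
    print(row["text"][:80])  # each row holds one length-capped text chunk
```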
Original datasets:
- https://huggingface.co/datasets/bookcorpus
- https://huggingface.co/datasets/wikipedia Variant: 20220301.en | gmongaras/BERT_Base_Cased_128_Dataset | [
"region:us"
] | 2024-02-02T18:50:34+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 34602046700, "num_examples": 134457967}], "download_size": 13058779121, "dataset_size": 34602046700}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-02T20:34:15+00:00 | [] | [] | TAGS
#region-us
| Dataset built with the bert-cased tokenizer, with sentences cut off at a length of 128 (single sentences, not sentence pairs); all sentence pairs were extracted.
Original datasets:
- URL
- URL Variant: URL | [] | [
"TAGS\n#region-us \n"
] |
a273b53440b52c65334e1e4e3e28876d03346793 |
# Dataset Card for Evaluation run of BFauber/opt125m_10e2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BFauber/opt125m_10e2](https://huggingface.co/BFauber/opt125m_10e2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BFauber__opt125m_10e2",
"harness_winogrande_5",
split="train")
```
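With 63 configurations available, it can be easier to enumerate them programmatically than to type the names by hand; a small sketch, assuming a `datasets` release that exposes `get_dataset_config_names`:

```python
from datasets import get_dataset_config_names

# One config per evaluated task, plus the aggregated "results" config
configs = get_dataset_config_names("open-llm-leaderboard/details_BFauber__opt125m_10e2")
print(len(configs))
print([c for c in configs if c.startswith("harness_hendrycksTest")][:5])
```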
## Latest results
These are the [latest results from run 2024-02-02T18:49:35.603979](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt125m_10e2/blob/main/results_2024-02-02T18-49-35.603979.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2648565772911682,
"acc_stderr": 0.031051795928190056,
"acc_norm": 0.26575034223671873,
"acc_norm_stderr": 0.03185766293174892,
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931583,
"mc2": 0.4259479561362201,
"mc2_stderr": 0.015004242057206638
},
"harness|arc:challenge|25": {
"acc": 0.2090443686006826,
"acc_stderr": 0.011882746987406453,
"acc_norm": 0.23208191126279865,
"acc_norm_stderr": 0.012336718284948854
},
"harness|hellaswag|10": {
"acc": 0.29087831109340767,
"acc_stderr": 0.004532393111248685,
"acc_norm": 0.3140808603863772,
"acc_norm_stderr": 0.004632001732332982
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.03633384414073461,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.03633384414073461
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.24342105263157895,
"acc_stderr": 0.034923496688842384,
"acc_norm": 0.24342105263157895,
"acc_norm_stderr": 0.034923496688842384
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2528301886792453,
"acc_stderr": 0.026749899771241238,
"acc_norm": 0.2528301886792453,
"acc_norm_stderr": 0.026749899771241238
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.22916666666666666,
"acc_stderr": 0.035146974678623884,
"acc_norm": 0.22916666666666666,
"acc_norm_stderr": 0.035146974678623884
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2774566473988439,
"acc_stderr": 0.03414014007044036,
"acc_norm": 0.2774566473988439,
"acc_norm_stderr": 0.03414014007044036
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082633,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082633
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.32340425531914896,
"acc_stderr": 0.030579442773610334,
"acc_norm": 0.32340425531914896,
"acc_norm_stderr": 0.030579442773610334
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.0414243971948936,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.0414243971948936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.18253968253968253,
"acc_stderr": 0.03455071019102146,
"acc_norm": 0.18253968253968253,
"acc_norm_stderr": 0.03455071019102146
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3161290322580645,
"acc_stderr": 0.02645087448904277,
"acc_norm": 0.3161290322580645,
"acc_norm_stderr": 0.02645087448904277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.30049261083743845,
"acc_stderr": 0.03225799476233484,
"acc_norm": 0.30049261083743845,
"acc_norm_stderr": 0.03225799476233484
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.20606060606060606,
"acc_stderr": 0.031584153240477086,
"acc_norm": 0.20606060606060606,
"acc_norm_stderr": 0.031584153240477086
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2676767676767677,
"acc_stderr": 0.03154449888270285,
"acc_norm": 0.2676767676767677,
"acc_norm_stderr": 0.03154449888270285
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.36787564766839376,
"acc_stderr": 0.03480175668466036,
"acc_norm": 0.36787564766839376,
"acc_norm_stderr": 0.03480175668466036
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3435897435897436,
"acc_stderr": 0.02407869658063547,
"acc_norm": 0.3435897435897436,
"acc_norm_stderr": 0.02407869658063547
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3487394957983193,
"acc_stderr": 0.03095663632856655,
"acc_norm": 0.3487394957983193,
"acc_norm_stderr": 0.03095663632856655
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22935779816513763,
"acc_stderr": 0.018025349724618684,
"acc_norm": 0.22935779816513763,
"acc_norm_stderr": 0.018025349724618684
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25316455696202533,
"acc_stderr": 0.02830465794303531,
"acc_norm": 0.25316455696202533,
"acc_norm_stderr": 0.02830465794303531
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.19730941704035873,
"acc_stderr": 0.02670985334496796,
"acc_norm": 0.19730941704035873,
"acc_norm_stderr": 0.02670985334496796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2366412213740458,
"acc_stderr": 0.03727673575596918,
"acc_norm": 0.2366412213740458,
"acc_norm_stderr": 0.03727673575596918
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.371900826446281,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.371900826446281,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2331288343558282,
"acc_stderr": 0.033220157957767414,
"acc_norm": 0.2331288343558282,
"acc_norm_stderr": 0.033220157957767414
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.20535714285714285,
"acc_stderr": 0.038342410214190735,
"acc_norm": 0.20535714285714285,
"acc_norm_stderr": 0.038342410214190735
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690879,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690879
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19658119658119658,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.19658119658119658,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.210727969348659,
"acc_stderr": 0.014583812465862546,
"acc_norm": 0.210727969348659,
"acc_norm_stderr": 0.014583812465862546
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.02289408248992599,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.02289408248992599
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2446927374301676,
"acc_stderr": 0.014378169884098407,
"acc_norm": 0.2446927374301676,
"acc_norm_stderr": 0.014378169884098407
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.27124183006535946,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.27124183006535946,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24115755627009647,
"acc_stderr": 0.024296594034763426,
"acc_norm": 0.24115755627009647,
"acc_norm_stderr": 0.024296594034763426
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.29012345679012347,
"acc_stderr": 0.02525117393649502,
"acc_norm": 0.29012345679012347,
"acc_norm_stderr": 0.02525117393649502
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2765957446808511,
"acc_stderr": 0.026684564340460994,
"acc_norm": 0.2765957446808511,
"acc_norm_stderr": 0.026684564340460994
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2529335071707953,
"acc_stderr": 0.011102268713839989,
"acc_norm": 0.2529335071707953,
"acc_norm_stderr": 0.011102268713839989
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2173202614379085,
"acc_stderr": 0.01668482092914859,
"acc_norm": 0.2173202614379085,
"acc_norm_stderr": 0.01668482092914859
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072774,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072774
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.27755102040816326,
"acc_stderr": 0.02866685779027465,
"acc_norm": 0.27755102040816326,
"acc_norm_stderr": 0.02866685779027465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.029929415408348398,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.029929415408348398
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.21084337349397592,
"acc_stderr": 0.03175554786629921,
"acc_norm": 0.21084337349397592,
"acc_norm_stderr": 0.03175554786629921
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.17543859649122806,
"acc_stderr": 0.029170885500727654,
"acc_norm": 0.17543859649122806,
"acc_norm_stderr": 0.029170885500727654
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931583,
"mc2": 0.4259479561362201,
"mc2_stderr": 0.015004242057206638
},
"harness|winogrande|5": {
"acc": 0.5217048145224941,
"acc_stderr": 0.01403923921648463
},
"harness|gsm8k|5": {
"acc": 0.001516300227445034,
"acc_stderr": 0.0010717793485492673
}
}
```
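To work with these aggregated numbers programmatically instead of reading the JSON above, the "results" configuration can be loaded directly; a sketch, using the "latest" split name that this card's own config listing declares:

```python
from datasets import load_dataset

# Aggregated per-run metrics; "latest" always tracks the newest evaluation run
results = load_dataset("open-llm-leaderboard/details_BFauber__opt125m_10e2",
                       "results",
                       split="latest")
print(results[0])  # assumed: one row per run, mirroring the metric dictionary above
```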
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_BFauber__opt125m_10e2 | [
"region:us"
] | 2024-02-02T18:51:19+00:00 | {"pretty_name": "Evaluation run of BFauber/opt125m_10e2", "dataset_summary": "Dataset automatically created during the evaluation run of model [BFauber/opt125m_10e2](https://huggingface.co/BFauber/opt125m_10e2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BFauber__opt125m_10e2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T18:49:35.603979](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt125m_10e2/blob/main/results_2024-02-02T18-49-35.603979.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2648565772911682,\n \"acc_stderr\": 0.031051795928190056,\n \"acc_norm\": 0.26575034223671873,\n \"acc_norm_stderr\": 0.03185766293174892,\n \"mc1\": 0.23378212974296206,\n \"mc1_stderr\": 0.014816195991931583,\n \"mc2\": 0.4259479561362201,\n \"mc2_stderr\": 0.015004242057206638\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.2090443686006826,\n \"acc_stderr\": 0.011882746987406453,\n \"acc_norm\": 0.23208191126279865,\n \"acc_norm_stderr\": 0.012336718284948854\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.29087831109340767,\n \"acc_stderr\": 0.004532393111248685,\n \"acc_norm\": 0.3140808603863772,\n \"acc_norm_stderr\": 0.004632001732332982\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n \"acc_stderr\": 0.03633384414073461,\n \"acc_norm\": 0.22962962962962963,\n \"acc_norm_stderr\": 0.03633384414073461\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.24342105263157895,\n \"acc_stderr\": 0.034923496688842384,\n \"acc_norm\": 0.24342105263157895,\n \"acc_norm_stderr\": 0.034923496688842384\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2528301886792453,\n \"acc_stderr\": 0.026749899771241238,\n \"acc_norm\": 0.2528301886792453,\n \"acc_norm_stderr\": 0.026749899771241238\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.22916666666666666,\n \"acc_stderr\": 0.035146974678623884,\n \"acc_norm\": 0.22916666666666666,\n \"acc_norm_stderr\": 0.035146974678623884\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.32,\n 
\"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2774566473988439,\n \"acc_stderr\": 0.03414014007044036,\n \"acc_norm\": 0.2774566473988439,\n \"acc_norm_stderr\": 0.03414014007044036\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082633,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082633\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.32340425531914896,\n \"acc_stderr\": 0.030579442773610334,\n \"acc_norm\": 0.32340425531914896,\n \"acc_norm_stderr\": 0.030579442773610334\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.0414243971948936,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.0414243971948936\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.18253968253968253,\n \"acc_stderr\": 0.03455071019102146,\n \"acc_norm\": 0.18253968253968253,\n \"acc_norm_stderr\": 0.03455071019102146\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036624,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036624\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3161290322580645,\n \"acc_stderr\": 0.02645087448904277,\n \"acc_norm\": 0.3161290322580645,\n \"acc_norm_stderr\": 0.02645087448904277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.30049261083743845,\n \"acc_stderr\": 0.03225799476233484,\n \"acc_norm\": 0.30049261083743845,\n \"acc_norm_stderr\": 0.03225799476233484\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.20606060606060606,\n \"acc_stderr\": 0.031584153240477086,\n \"acc_norm\": 0.20606060606060606,\n \"acc_norm_stderr\": 0.031584153240477086\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.2676767676767677,\n \"acc_stderr\": 0.03154449888270285,\n \"acc_norm\": 0.2676767676767677,\n \"acc_norm_stderr\": 0.03154449888270285\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.36787564766839376,\n \"acc_stderr\": 0.03480175668466036,\n \"acc_norm\": 0.36787564766839376,\n \"acc_norm_stderr\": 0.03480175668466036\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.3435897435897436,\n \"acc_stderr\": 0.02407869658063547,\n \"acc_norm\": 0.3435897435897436,\n \"acc_norm_stderr\": 0.02407869658063547\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3487394957983193,\n \"acc_stderr\": 0.03095663632856655,\n \"acc_norm\": 0.3487394957983193,\n \"acc_norm_stderr\": 0.03095663632856655\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.22935779816513763,\n \"acc_stderr\": 0.018025349724618684,\n \"acc_norm\": 0.22935779816513763,\n \"acc_norm_stderr\": 0.018025349724618684\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.25316455696202533,\n \"acc_stderr\": 0.02830465794303531,\n \"acc_norm\": 0.25316455696202533,\n \"acc_norm_stderr\": 0.02830465794303531\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.19730941704035873,\n \"acc_stderr\": 0.02670985334496796,\n \"acc_norm\": 0.19730941704035873,\n \"acc_norm_stderr\": 0.02670985334496796\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.03727673575596918,\n \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.03727673575596918\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.371900826446281,\n \"acc_stderr\": 0.044120158066245044,\n \"acc_norm\": 0.371900826446281,\n \"acc_norm_stderr\": 0.044120158066245044\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.21296296296296297,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.033220157957767414,\n \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.033220157957767414\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.20535714285714285,\n \"acc_stderr\": 0.038342410214190735,\n \"acc_norm\": 0.20535714285714285,\n \"acc_norm_stderr\": 0.038342410214190735\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690879,\n \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690879\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19658119658119658,\n \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.19658119658119658,\n \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.210727969348659,\n 
\"acc_stderr\": 0.014583812465862546,\n \"acc_norm\": 0.210727969348659,\n \"acc_norm_stderr\": 0.014583812465862546\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.23699421965317918,\n \"acc_stderr\": 0.02289408248992599,\n \"acc_norm\": 0.23699421965317918,\n \"acc_norm_stderr\": 0.02289408248992599\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2446927374301676,\n \"acc_stderr\": 0.014378169884098407,\n \"acc_norm\": 0.2446927374301676,\n \"acc_norm_stderr\": 0.014378169884098407\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.27124183006535946,\n \"acc_stderr\": 0.02545775669666788,\n \"acc_norm\": 0.27124183006535946,\n \"acc_norm_stderr\": 0.02545775669666788\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24115755627009647,\n \"acc_stderr\": 0.024296594034763426,\n \"acc_norm\": 0.24115755627009647,\n \"acc_norm_stderr\": 0.024296594034763426\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.29012345679012347,\n \"acc_stderr\": 0.02525117393649502,\n \"acc_norm\": 0.29012345679012347,\n \"acc_norm_stderr\": 0.02525117393649502\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2765957446808511,\n \"acc_stderr\": 0.026684564340460994,\n \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.026684564340460994\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2529335071707953,\n \"acc_stderr\": 0.011102268713839989,\n \"acc_norm\": 0.2529335071707953,\n \"acc_norm_stderr\": 0.011102268713839989\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2173202614379085,\n \"acc_stderr\": 0.01668482092914859,\n \"acc_norm\": 0.2173202614379085,\n \"acc_norm_stderr\": 0.01668482092914859\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n \"acc_stderr\": 0.04013964554072774,\n \"acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.04013964554072774\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.27755102040816326,\n \"acc_stderr\": 0.02866685779027465,\n \"acc_norm\": 0.27755102040816326,\n \"acc_norm_stderr\": 0.02866685779027465\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n \"acc_stderr\": 0.029929415408348398,\n \"acc_norm\": 0.23383084577114427,\n \"acc_norm_stderr\": 0.029929415408348398\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.21084337349397592,\n \"acc_stderr\": 0.03175554786629921,\n \"acc_norm\": 0.21084337349397592,\n \"acc_norm_stderr\": 0.03175554786629921\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.17543859649122806,\n \"acc_stderr\": 0.029170885500727654,\n \"acc_norm\": 0.17543859649122806,\n \"acc_norm_stderr\": 0.029170885500727654\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23378212974296206,\n \"mc1_stderr\": 0.014816195991931583,\n \"mc2\": 0.4259479561362201,\n \"mc2_stderr\": 0.015004242057206638\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5217048145224941,\n \"acc_stderr\": 0.01403923921648463\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.001516300227445034,\n \"acc_stderr\": 0.0010717793485492673\n 
}\n}\n```", "repo_url": "https://huggingface.co/BFauber/opt125m_10e2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|arc:challenge|25_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|gsm8k|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hellaswag|10_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-49-35.603979.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-49-35.603979.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-49-35.603979.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T18-49-35.603979.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-49-35.603979.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T18_49_35.603979", "path": ["**/details_harness|winogrande|5_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T18-49-35.603979.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T18_49_35.603979", "path": ["results_2024-02-02T18-49-35.603979.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T18-49-35.603979.parquet"]}]}]} | 2024-02-02T18:51:44+00:00 | [] | [] | TAGS
#region-us
7666dd58089d334ddd7dfb6f4b4520024feb809f |
# Dataset Card for Evaluation run of ConvexAI/BurningBruce-005
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ConvexAI/BurningBruce-005](https://huggingface.co/ConvexAI/BurningBruce-005) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ConvexAI__BurningBruce-005",
"harness_winogrande_5",
split="train")
```
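To pull the run-level aggregates instead of per-example details, the "results" configuration mentioned above can be loaded the same way. This is a sketch, assuming the config and split names follow the pattern in this card's metadata:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics; "latest" points at the most
# recent evaluation (assumption: names follow this repo's metadata).
results = load_dataset("open-llm-leaderboard/details_ConvexAI__BurningBruce-005",
	"results",
	split="latest")
print(results[0])
```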
## Latest results
These are the [latest results from run 2024-02-02T18:58:34.137305](https://huggingface.co/datasets/open-llm-leaderboard/details_ConvexAI__BurningBruce-005/blob/main/results_2024-02-02T18-58-34.137305.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6536170901380703,
"acc_stderr": 0.032028038336707275,
"acc_norm": 0.6528681277337212,
"acc_norm_stderr": 0.032697450933548394,
"mc1": 0.543451652386781,
"mc1_stderr": 0.017437280953183688,
"mc2": 0.6726501530582988,
"mc2_stderr": 0.015249067039770463
},
"harness|arc:challenge|25": {
"acc": 0.6945392491467577,
"acc_stderr": 0.013460080478002507,
"acc_norm": 0.7201365187713311,
"acc_norm_stderr": 0.013119040897725922
},
"harness|hellaswag|10": {
"acc": 0.7117108145787692,
"acc_stderr": 0.004520406331084042,
"acc_norm": 0.8830910177255527,
"acc_norm_stderr": 0.003206551283257396
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7283018867924528,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.7283018867924528,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.035331333893236574,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.035331333893236574
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083522,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083522
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652456,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652456
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886786,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886786
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.034076320938540516,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.034076320938540516
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371803,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371803
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43798882681564244,
"acc_stderr": 0.016593394227564843,
"acc_norm": 0.43798882681564244,
"acc_norm_stderr": 0.016593394227564843
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188936,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188936
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4680573663624511,
"acc_stderr": 0.012744149704869649,
"acc_norm": 0.4680573663624511,
"acc_norm_stderr": 0.012744149704869649
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.02858270975389845,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.02858270975389845
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.019070985589687495,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.019070985589687495
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142773,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142773
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169146,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169146
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.543451652386781,
"mc1_stderr": 0.017437280953183688,
"mc2": 0.6726501530582988,
"mc2_stderr": 0.015249067039770463
},
"harness|winogrande|5": {
"acc": 0.8334648776637726,
"acc_stderr": 0.010470796496781096
},
"harness|gsm8k|5": {
"acc": 0.7149355572403336,
"acc_stderr": 0.012435042334904004
}
}
```
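The aggregated scores above can also be retrieved programmatically through the "results" configuration. This is a minimal sketch; the exact column layout of the results table is not documented here, so it simply prints whatever fields the first row exposes:

```python
from datasets import load_dataset

# The "latest" split of the "results" configuration points to the most recent run.
results = load_dataset("open-llm-leaderboard/details_ConvexAI__BurningBruce-005",
	"results",
	split="latest")

# The schema is left unspecified by the card, so inspect the first row to discover it.
print(results[0])
```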
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_ConvexAI__BurningBruce-005 | [
"region:us"
] | 2024-02-02T19:00:51+00:00 | {"pretty_name": "Evaluation run of ConvexAI/BurningBruce-005", "dataset_summary": "Dataset automatically created during the evaluation run of model [ConvexAI/BurningBruce-005](https://huggingface.co/ConvexAI/BurningBruce-005) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ConvexAI__BurningBruce-005\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T18:58:34.137305](https://huggingface.co/datasets/open-llm-leaderboard/details_ConvexAI__BurningBruce-005/blob/main/results_2024-02-02T18-58-34.137305.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6536170901380703,\n \"acc_stderr\": 0.032028038336707275,\n \"acc_norm\": 0.6528681277337212,\n \"acc_norm_stderr\": 0.032697450933548394,\n \"mc1\": 0.543451652386781,\n \"mc1_stderr\": 0.017437280953183688,\n \"mc2\": 0.6726501530582988,\n \"mc2_stderr\": 0.015249067039770463\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6945392491467577,\n \"acc_stderr\": 0.013460080478002507,\n \"acc_norm\": 0.7201365187713311,\n \"acc_norm_stderr\": 0.013119040897725922\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7117108145787692,\n \"acc_stderr\": 0.004520406331084042,\n \"acc_norm\": 0.8830910177255527,\n \"acc_norm_stderr\": 0.003206551283257396\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7283018867924528,\n \"acc_stderr\": 0.027377706624670713,\n \"acc_norm\": 0.7283018867924528,\n \"acc_norm_stderr\": 0.027377706624670713\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n 
\"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.035331333893236574,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.035331333893236574\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083522,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083522\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652456,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652456\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886786,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886786\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.034076320938540516,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.034076320938540516\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n 
\"acc_stderr\": 0.013507943909371803,\n \"acc_norm\": 0.8275862068965517,\n \"acc_norm_stderr\": 0.013507943909371803\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43798882681564244,\n \"acc_stderr\": 0.016593394227564843,\n \"acc_norm\": 0.43798882681564244,\n \"acc_norm_stderr\": 0.016593394227564843\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188936,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188936\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n \"acc_stderr\": 0.012744149704869649,\n \"acc_norm\": 0.4680573663624511,\n \"acc_norm_stderr\": 0.012744149704869649\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389845,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389845\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687495,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687495\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142773,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142773\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169146,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169146\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.543451652386781,\n \"mc1_stderr\": 0.017437280953183688,\n \"mc2\": 0.6726501530582988,\n \"mc2_stderr\": 0.015249067039770463\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8334648776637726,\n \"acc_stderr\": 0.010470796496781096\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7149355572403336,\n \"acc_stderr\": 0.012435042334904004\n }\n}\n```", "repo_url": 
"https://huggingface.co/ConvexAI/BurningBruce-005", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|arc:challenge|25_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|gsm8k|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hellaswag|10_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-58-34.137305.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-58-34.137305.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-58-34.137305.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T18-58-34.137305.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-58-34.137305.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T18_58_34.137305", "path": ["**/details_harness|winogrande|5_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T18-58-34.137305.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T18_58_34.137305", "path": ["results_2024-02-02T18-58-34.137305.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T18-58-34.137305.parquet"]}]}]} | 2024-02-02T19:01:17+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of ConvexAI/BurningBruce-005
Dataset automatically created during the evaluation run of model ConvexAI/BurningBruce-005 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-02-02T18:58:34.137305 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of ConvexAI/BurningBruce-005\n\n\n\nDataset automatically created during the evaluation run of model ConvexAI/BurningBruce-005 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T18:58:34.137305(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of ConvexAI/BurningBruce-005\n\n\n\nDataset automatically created during the evaluation run of model ConvexAI/BurningBruce-005 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T18:58:34.137305(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
a688471e2e02627322cafd7cc7d62d3472e67e3e |
# Dataset Card for Evaluation run of BFauber/opt1.3b_10e4
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BFauber/opt1.3b_10e4](https://huggingface.co/BFauber/opt1.3b_10e4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BFauber__opt1.3b_10e4",
"harness_winogrande_5",
split="train")
```
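The same pattern works for the other configurations listed in this card. As a minimal sketch (assuming only that the `datasets` library is installed; the config names below are taken from this repository's own metadata), you can also pull the aggregated "results" config or the details of a single sub-benchmark:

```python
from datasets import load_dataset

# Aggregated metrics for the run; the "latest" split always points
# at the most recent evaluation of this model.
results = load_dataset("open-llm-leaderboard/details_BFauber__opt1.3b_10e4",
                       "results",
                       split="latest")

# Per-sample details for one sub-benchmark, e.g. 5-shot GSM8K.
gsm8k = load_dataset("open-llm-leaderboard/details_BFauber__opt1.3b_10e4",
                     "harness_gsm8k_5",
                     split="latest")

print(results[0])   # one row holding the aggregated scores
print(gsm8k[0])     # one evaluated GSM8K example
```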
## Latest results
These are the [latest results from run 2024-02-02T19:08:04.507559](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt1.3b_10e4/blob/main/results_2024-02-02T19-08-04.507559.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.271963602705084,
"acc_stderr": 0.031002150754247282,
"acc_norm": 0.27398469630191746,
"acc_norm_stderr": 0.031827939590220955,
"mc1": 0.23378212974296206,
"mc1_stderr": 0.01481619599193158,
"mc2": 0.3867355581952227,
"mc2_stderr": 0.014105420630947732
},
"harness|arc:challenge|25": {
"acc": 0.2687713310580205,
"acc_stderr": 0.012955065963710691,
"acc_norm": 0.3054607508532423,
"acc_norm_stderr": 0.013460080478002498
},
"harness|hellaswag|10": {
"acc": 0.4124676359290978,
"acc_stderr": 0.004912723848944797,
"acc_norm": 0.5351523600876319,
"acc_norm_stderr": 0.004977434505403351
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.03633384414073461,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.03633384414073461
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3092105263157895,
"acc_stderr": 0.03761070869867479,
"acc_norm": 0.3092105263157895,
"acc_norm_stderr": 0.03761070869867479
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2981132075471698,
"acc_stderr": 0.028152837942493857,
"acc_norm": 0.2981132075471698,
"acc_norm_stderr": 0.028152837942493857
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.32947976878612717,
"acc_stderr": 0.03583901754736411,
"acc_norm": 0.32947976878612717,
"acc_norm_stderr": 0.03583901754736411
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383889,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383889
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.20851063829787234,
"acc_stderr": 0.026556982117838728,
"acc_norm": 0.20851063829787234,
"acc_norm_stderr": 0.026556982117838728
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813344,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813344
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.03600105692727771,
"acc_norm": 0.2482758620689655,
"acc_norm_stderr": 0.03600105692727771
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2724867724867725,
"acc_stderr": 0.022930973071633328,
"acc_norm": 0.2724867724867725,
"acc_norm_stderr": 0.022930973071633328
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574925,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574925
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.29354838709677417,
"acc_stderr": 0.025906087021319288,
"acc_norm": 0.29354838709677417,
"acc_norm_stderr": 0.025906087021319288
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.28078817733990147,
"acc_stderr": 0.03161856335358609,
"acc_norm": 0.28078817733990147,
"acc_norm_stderr": 0.03161856335358609
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.0331750593000918,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.0331750593000918
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35353535353535354,
"acc_stderr": 0.03406086723547153,
"acc_norm": 0.35353535353535354,
"acc_norm_stderr": 0.03406086723547153
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3626943005181347,
"acc_stderr": 0.034697137917043715,
"acc_norm": 0.3626943005181347,
"acc_norm_stderr": 0.034697137917043715
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.358974358974359,
"acc_stderr": 0.024321738484602357,
"acc_norm": 0.358974358974359,
"acc_norm_stderr": 0.024321738484602357
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073828,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073828
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3697478991596639,
"acc_stderr": 0.031357095996135904,
"acc_norm": 0.3697478991596639,
"acc_norm_stderr": 0.031357095996135904
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3486238532110092,
"acc_stderr": 0.020431254090714328,
"acc_norm": 0.3486238532110092,
"acc_norm_stderr": 0.020431254090714328
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.03019028245350195,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.03019028245350195
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.1940928270042194,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.1940928270042194,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.11210762331838565,
"acc_stderr": 0.021174894206346103,
"acc_norm": 0.11210762331838565,
"acc_norm_stderr": 0.021174894206346103
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.15702479338842976,
"acc_stderr": 0.0332124484254713,
"acc_norm": 0.15702479338842976,
"acc_norm_stderr": 0.0332124484254713
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03755265865037182,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03755265865037182
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2392638036809816,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.2392638036809816,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.16964285714285715,
"acc_stderr": 0.03562367850095391,
"acc_norm": 0.16964285714285715,
"acc_norm_stderr": 0.03562367850095391
},
"harness|hendrycksTest-management|5": {
"acc": 0.3786407766990291,
"acc_stderr": 0.04802694698258972,
"acc_norm": 0.3786407766990291,
"acc_norm_stderr": 0.04802694698258972
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.18803418803418803,
"acc_stderr": 0.025598193686652282,
"acc_norm": 0.18803418803418803,
"acc_norm_stderr": 0.025598193686652282
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.210727969348659,
"acc_stderr": 0.014583812465862553,
"acc_norm": 0.210727969348659,
"acc_norm_stderr": 0.014583812465862553
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2398843930635838,
"acc_stderr": 0.022989592543123567,
"acc_norm": 0.2398843930635838,
"acc_norm_stderr": 0.022989592543123567
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.014355911964767864,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.014355911964767864
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.026090162504279053,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.026090162504279053
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2315112540192926,
"acc_stderr": 0.023956532766639133,
"acc_norm": 0.2315112540192926,
"acc_norm_stderr": 0.023956532766639133
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.023468429832451145,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.023468429832451145
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2624113475177305,
"acc_stderr": 0.026244920349843007,
"acc_norm": 0.2624113475177305,
"acc_norm_stderr": 0.026244920349843007
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24511082138200782,
"acc_stderr": 0.010986307870045517,
"acc_norm": 0.24511082138200782,
"acc_norm_stderr": 0.010986307870045517
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.016819028375736386,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.016819028375736386
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.37551020408163266,
"acc_stderr": 0.03100120903989484,
"acc_norm": 0.37551020408163266,
"acc_norm_stderr": 0.03100120903989484
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2885572139303483,
"acc_stderr": 0.03203841040213321,
"acc_norm": 0.2885572139303483,
"acc_norm_stderr": 0.03203841040213321
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-virology|5": {
"acc": 0.19879518072289157,
"acc_stderr": 0.031069390260789437,
"acc_norm": 0.19879518072289157,
"acc_norm_stderr": 0.031069390260789437
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2046783625730994,
"acc_stderr": 0.030944459778533207,
"acc_norm": 0.2046783625730994,
"acc_norm_stderr": 0.030944459778533207
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23378212974296206,
"mc1_stderr": 0.01481619599193158,
"mc2": 0.3867355581952227,
"mc2_stderr": 0.014105420630947732
},
"harness|winogrande|5": {
"acc": 0.5840568271507498,
"acc_stderr": 0.013852485356798262
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
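If you would rather recompute a summary than read it off the leaderboard, a small sketch like the following works on the JSON blob above (it assumes you have saved that blob locally as `results.json`; the file name is purely illustrative). It averages the normalized accuracy over the MMLU ("hendrycksTest") subtasks:

```python
import json

# Illustrative path: save the results JSON shown above to this file first.
with open("results.json") as f:
    results = json.load(f)

# Keep only the MMLU subtasks and average their normalized accuracy.
mmlu_scores = [task["acc_norm"] for name, task in results.items()
               if name.startswith("harness|hendrycksTest-")]
print(f"{len(mmlu_scores)} MMLU subtasks, "
      f"mean acc_norm = {sum(mmlu_scores) / len(mmlu_scores):.4f}")
```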
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_BFauber__opt1.3b_10e4 | [
"region:us"
] | 2024-02-02T19:09:51+00:00 | {"pretty_name": "Evaluation run of BFauber/opt1.3b_10e4", "dataset_summary": "Dataset automatically created during the evaluation run of model [BFauber/opt1.3b_10e4](https://huggingface.co/BFauber/opt1.3b_10e4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BFauber__opt1.3b_10e4\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T19:08:04.507559](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt1.3b_10e4/blob/main/results_2024-02-02T19-08-04.507559.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.271963602705084,\n \"acc_stderr\": 0.031002150754247282,\n \"acc_norm\": 0.27398469630191746,\n \"acc_norm_stderr\": 0.031827939590220955,\n \"mc1\": 0.23378212974296206,\n \"mc1_stderr\": 0.01481619599193158,\n \"mc2\": 0.3867355581952227,\n \"mc2_stderr\": 0.014105420630947732\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.2687713310580205,\n \"acc_stderr\": 0.012955065963710691,\n \"acc_norm\": 0.3054607508532423,\n \"acc_norm_stderr\": 0.013460080478002498\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4124676359290978,\n \"acc_stderr\": 0.004912723848944797,\n \"acc_norm\": 0.5351523600876319,\n \"acc_norm_stderr\": 0.004977434505403351\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n \"acc_stderr\": 0.03633384414073461,\n \"acc_norm\": 0.22962962962962963,\n \"acc_norm_stderr\": 0.03633384414073461\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3092105263157895,\n \"acc_stderr\": 0.03761070869867479,\n \"acc_norm\": 0.3092105263157895,\n \"acc_norm_stderr\": 0.03761070869867479\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2981132075471698,\n \"acc_stderr\": 0.028152837942493857,\n \"acc_norm\": 0.2981132075471698,\n \"acc_norm_stderr\": 0.028152837942493857\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 
0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.32947976878612717,\n \"acc_stderr\": 0.03583901754736411,\n \"acc_norm\": 0.32947976878612717,\n \"acc_norm_stderr\": 0.03583901754736411\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383889,\n \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383889\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.20851063829787234,\n \"acc_stderr\": 0.026556982117838728,\n \"acc_norm\": 0.20851063829787234,\n \"acc_norm_stderr\": 0.026556982117838728\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.03600105692727771,\n \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.03600105692727771\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2724867724867725,\n \"acc_stderr\": 0.022930973071633328,\n \"acc_norm\": 0.2724867724867725,\n \"acc_norm_stderr\": 0.022930973071633328\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n \"acc_stderr\": 0.04240799327574925,\n \"acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.04240799327574925\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.29354838709677417,\n \"acc_stderr\": 0.025906087021319288,\n \"acc_norm\": 0.29354838709677417,\n \"acc_norm_stderr\": 0.025906087021319288\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.28078817733990147,\n \"acc_stderr\": 0.03161856335358609,\n \"acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.03161856335358609\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.23636363636363636,\n \"acc_stderr\": 0.0331750593000918,\n \"acc_norm\": 0.23636363636363636,\n \"acc_norm_stderr\": 0.0331750593000918\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.35353535353535354,\n \"acc_stderr\": 0.03406086723547153,\n \"acc_norm\": 0.35353535353535354,\n \"acc_norm_stderr\": 0.03406086723547153\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.3626943005181347,\n \"acc_stderr\": 0.034697137917043715,\n \"acc_norm\": 0.3626943005181347,\n \"acc_norm_stderr\": 0.034697137917043715\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.358974358974359,\n \"acc_stderr\": 0.024321738484602357,\n \"acc_norm\": 0.358974358974359,\n \"acc_norm_stderr\": 0.024321738484602357\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073828,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073828\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3697478991596639,\n \"acc_stderr\": 0.031357095996135904,\n \"acc_norm\": 0.3697478991596639,\n \"acc_norm_stderr\": 0.031357095996135904\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3486238532110092,\n \"acc_stderr\": 0.020431254090714328,\n \"acc_norm\": 0.3486238532110092,\n \"acc_norm_stderr\": 0.020431254090714328\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.03019028245350195,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.03019028245350195\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.1940928270042194,\n \"acc_stderr\": 0.025744902532290916,\n \"acc_norm\": 0.1940928270042194,\n \"acc_norm_stderr\": 0.025744902532290916\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.11210762331838565,\n \"acc_stderr\": 0.021174894206346103,\n \"acc_norm\": 0.11210762331838565,\n \"acc_norm_stderr\": 0.021174894206346103\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306085,\n \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306085\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.15702479338842976,\n \"acc_stderr\": 0.0332124484254713,\n \"acc_norm\": 0.15702479338842976,\n \"acc_norm_stderr\": 0.0332124484254713\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03755265865037182,\n \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03755265865037182\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.16964285714285715,\n \"acc_stderr\": 0.03562367850095391,\n \"acc_norm\": 0.16964285714285715,\n \"acc_norm_stderr\": 0.03562367850095391\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.3786407766990291,\n \"acc_stderr\": 0.04802694698258972,\n \"acc_norm\": 0.3786407766990291,\n \"acc_norm_stderr\": 0.04802694698258972\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.18803418803418803,\n \"acc_stderr\": 0.025598193686652282,\n \"acc_norm\": 0.18803418803418803,\n \"acc_norm_stderr\": 0.025598193686652282\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.210727969348659,\n \"acc_stderr\": 0.014583812465862553,\n \"acc_norm\": 0.210727969348659,\n \"acc_norm_stderr\": 0.014583812465862553\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2398843930635838,\n \"acc_stderr\": 0.022989592543123567,\n \"acc_norm\": 0.2398843930635838,\n \"acc_norm_stderr\": 0.022989592543123567\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n \"acc_stderr\": 0.014355911964767864,\n \"acc_norm\": 0.2435754189944134,\n \"acc_norm_stderr\": 0.014355911964767864\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.026090162504279053,\n \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.026090162504279053\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2315112540192926,\n \"acc_stderr\": 0.023956532766639133,\n \"acc_norm\": 0.2315112540192926,\n \"acc_norm_stderr\": 0.023956532766639133\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.23148148148148148,\n \"acc_stderr\": 0.023468429832451145,\n \"acc_norm\": 0.23148148148148148,\n \"acc_norm_stderr\": 0.023468429832451145\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2624113475177305,\n \"acc_stderr\": 0.026244920349843007,\n \"acc_norm\": 0.2624113475177305,\n \"acc_norm_stderr\": 0.026244920349843007\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24511082138200782,\n \"acc_stderr\": 0.010986307870045517,\n \"acc_norm\": 0.24511082138200782,\n \"acc_norm_stderr\": 0.010986307870045517\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.016819028375736386,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.016819028375736386\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.37551020408163266,\n \"acc_stderr\": 0.03100120903989484,\n \"acc_norm\": 0.37551020408163266,\n \"acc_norm_stderr\": 0.03100120903989484\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2885572139303483,\n \"acc_stderr\": 0.03203841040213321,\n \"acc_norm\": 0.2885572139303483,\n \"acc_norm_stderr\": 0.03203841040213321\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.19879518072289157,\n \"acc_stderr\": 0.031069390260789437,\n \"acc_norm\": 0.19879518072289157,\n \"acc_norm_stderr\": 0.031069390260789437\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2046783625730994,\n \"acc_stderr\": 0.030944459778533207,\n \"acc_norm\": 0.2046783625730994,\n \"acc_norm_stderr\": 0.030944459778533207\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23378212974296206,\n \"mc1_stderr\": 0.01481619599193158,\n \"mc2\": 0.3867355581952227,\n \"mc2_stderr\": 0.014105420630947732\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5840568271507498,\n \"acc_stderr\": 0.013852485356798262\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n 
}\n}\n```", "repo_url": "https://huggingface.co/BFauber/opt1.3b_10e4", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|arc:challenge|25_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|gsm8k|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hellaswag|10_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-08-04.507559.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-08-04.507559.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-08-04.507559.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T19-08-04.507559.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-08-04.507559.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T19_08_04.507559", "path": ["**/details_harness|winogrande|5_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T19-08-04.507559.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T19_08_04.507559", "path": ["results_2024-02-02T19-08-04.507559.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T19-08-04.507559.parquet"]}]}]} | 2024-02-02T19:10:16+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of BFauber/opt1.3b_10e4
Dataset automatically created during the evaluation run of model BFauber/opt1.3b_10e4 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
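A minimal sketch, assuming the repository id follows the usual `open-llm-leaderboard/details_<org>__<model>` naming for this run:

```python
from datasets import load_dataset

# One task configuration; the "train" split always points at the latest results.
data = load_dataset("open-llm-leaderboard/details_BFauber__opt1.3b_10e4",
	"harness_winogrande_5",
	split="train")
```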
## Latest results
These are the latest results from run 2024-02-02T19:08:04.507559 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of BFauber/opt1.3b_10e4\n\n\n\nDataset automatically created during the evaluation run of model BFauber/opt1.3b_10e4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T19:08:04.507559(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of BFauber/opt1.3b_10e4\n\n\n\nDataset automatically created during the evaluation run of model BFauber/opt1.3b_10e4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T19:08:04.507559(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
8b1e98a2ef81ad8c0f1880109e3abf18039348e4 |
# CrawlPT (deduplicated)
CrawlPT is a generic Portuguese corpus extracted from various web pages.
## Dataset Details
The dataset is composed of three corpora:
[brWaC](https://aclanthology.org/L18-1686/), [C100-PT](https://arxiv.org/abs/1911.02116), and [OSCAR-2301](http://arxiv.org/abs/2201.06642).
- **brWaC**: a web corpus for Brazilian Portuguese from 120,000 different websites.
- **C100-PT**: Portuguese subset from CC-100. C100 was created for training the multilingual Transformer XLM-R, containing two terabytes of cleaned data from 2018 snapshots of the [Common Crawl project](https://commoncrawl.org/about/) in 100 languages. We use the Portuguese subset, which contains 49.1 GiB of text.
- **OSCAR-2301-PT**: curation from OSCAR-2301 in the Portuguese language.
### Dataset Description
- **Curated by:** [More Information Needed]
- **Funded by:** [More Information Needed]
- **Language(s) (NLP):** Brazilian Portuguese (pt-BR)
- **License:** [Creative Commons Attribution 4.0 International Public License](https://creativecommons.org/licenses/by/4.0/deed.en)
### Dataset Sources
- **Repository:** https://github.com/eduagarcia/roberta-legal-portuguese
- **Paper:** [More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
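Each sub-corpus is exposed as a separate configuration (`brwac`, `cc100`, and `OSCAR-2301`), with one record per document carrying `id`, `text`, and `meta` fields. A minimal loading sketch, assuming the Hub repository id `eduagarcia/CrawlPT_dedup`:

```python
from datasets import load_dataset

# Load one sub-corpus; configuration names: "brwac", "cc100", "OSCAR-2301".
brwac = load_dataset("eduagarcia/CrawlPT_dedup", "brwac", split="train")
print(brwac[0]["id"], brwac[0]["text"][:200])  # each record: id, text, meta
```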
## Data Collection and Processing
Raw corpora sizes in terms of billions of tokens and file size in GiB:
| Corpus | Domain | Tokens (B) | Size (GiB) |
|-----------------|:-------:|:----------:|:----------:|
| brWaC | General | 2.7 | 16.3 |
| CC100 (PT) | General | 8.4 | 49.1 |
| OSCAR-2301 (PT) | General | 18.1 | 97.8 |
CrawlPT is deduplicated using the [MinHash algorithm](https://dl.acm.org/doi/abs/10.5555/647819.736184) and [Locality Sensitive Hashing](https://dspace.mit.edu/bitstream/handle/1721.1/134231/v008a014.pdf?sequence=2&isAllowed=y), following the approach of [Lee et al. (2022)](http://arxiv.org/abs/2107.06499).
We used 5-grams and a signature of size 256, considering two documents to be identical if their Jaccard similarity exceeded 0.7.
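As an illustration of this step, a minimal sketch using the `datasketch` library (the exact tooling used here is an assumption), with word 5-gram shingles, 256 permutations, and a 0.7 Jaccard threshold:

```python
from datasketch import MinHash, MinHashLSH

def minhash_of(text: str, num_perm: int = 256) -> MinHash:
    # Shingle the document into word 5-grams and hash each shingle.
    words = text.split()
    shingles = {" ".join(words[i:i + 5]) for i in range(max(1, len(words) - 4))}
    m = MinHash(num_perm=num_perm)
    for s in shingles:
        m.update(s.encode("utf-8"))
    return m

# Documents whose estimated Jaccard similarity exceeds 0.7 fall into one cluster.
lsh = MinHashLSH(threshold=0.7, num_perm=256)
deduplicated = {}
for doc_id, text in corpus.items():  # `corpus` is a {doc_id: text} mapping (assumed)
    m = minhash_of(text)
    if lsh.query(m):                 # a near-duplicate is already indexed: drop
        continue
    lsh.insert(doc_id, m)            # first member of its cluster: keep
    deduplicated[doc_id] = text
```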
Duplication rates found by the MinHash-LSH algorithm for the CrawlPT corpus:
| Corpus | Documents | Docs. after deduplication | Duplicates (%) |
|------------------------|:----------:|:-------------------------:|:--------------:|
| brWaC | 3,530,796 | 3,513,588 | 0.49 |
| OSCAR-2301 (PT Subset) | 18,031,400 | 10,888,966 | 39.61 |
| CC100 (PT Subset) | 38,999,388 | 38,059,979 | 2.41 |
| Total (CrawlPT) | 60,561,584 | 52,462,533 | 13.37 |
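Each rate follows directly from the document counts; for OSCAR-2301, for instance, (18,031,400 − 10,888,966) / 18,031,400 ≈ 39.61%.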
## Citation
```bibtex
@InProceedings{garcia2024_roberlexpt,
author="Garcia, Eduardo A. S.
and Silva, N{\'a}dia F. F.
and Siqueira, Felipe
and Gomes, Juliana R. S.
and Albuquerque, Hidelberg O.
and Souza, Ellen
and Lima, Eliomar
and De Carvalho, André",
title="RoBERTaLexPT: A Legal RoBERTa Model pretrained with deduplication for Portuguese",
booktitle="Computational Processing of the Portuguese Language",
year="2024",
publisher="Association for Computational Linguistics"
}
```
## Acknowledgment
This work has been supported by the AI Center of Excellence (Centro de Excelência em Inteligência Artificial – CEIA) of the Institute of Informatics at the Federal University of Goiás (INF-UFG). | eduagarcia/CrawlPT_dedup | [
"task_categories:text-generation",
"size_categories:10M<n<100M",
"language:pt",
"license:cc-by-4.0",
"arxiv:1911.02116",
"arxiv:2201.06642",
"arxiv:2107.06499",
"region:us"
] | 2024-02-02T19:17:24+00:00 | {"language": ["pt"], "license": "cc-by-4.0", "size_categories": ["10M<n<100M"], "task_categories": ["text-generation"], "pretty_name": "CrawlPT (deduplicated)", "dataset_info": [{"config_name": "OSCAR-2301", "features": [{"name": "id", "dtype": "int64"}, {"name": "text", "dtype": "string"}, {"name": "meta", "struct": [{"name": "categories", "sequence": "string"}, {"name": "dedup", "struct": [{"name": "exact_norm", "struct": [{"name": "cluster_main_idx", "dtype": "int64"}, {"name": "cluster_size", "dtype": "int64"}, {"name": "exact_hash_idx", "dtype": "int64"}, {"name": "is_duplicate", "dtype": "bool"}]}, {"name": "minhash", "struct": [{"name": "cluster_main_idx", "dtype": "int64"}, {"name": "cluster_size", "dtype": "int64"}, {"name": "is_duplicate", "dtype": "bool"}, {"name": "minhash_idx", "dtype": "int64"}]}]}, {"name": "harmful_pp", "dtype": "float64"}, {"name": "identification", "struct": [{"name": "label", "dtype": "string"}, {"name": "prob", "dtype": "float64"}]}, {"name": "quality_warnings", "sequence": "string"}, {"name": "sentence_identifications", "list": [{"name": "label", "dtype": "string"}, {"name": "prob", "dtype": "float64"}]}, {"name": "tlsh", "dtype": "string"}, {"name": "warc_headers", "struct": [{"name": "content-length", "dtype": "int64"}, {"name": "content-type", "dtype": "string"}, {"name": "warc-block-digest", "dtype": "string"}, {"name": "warc-date", "dtype": "string"}, {"name": "warc-identified-content-language", "dtype": "string"}, {"name": "warc-record-id", "dtype": "string"}, {"name": "warc-refers-to", "dtype": "string"}, {"name": "warc-target-uri", "dtype": "string"}, {"name": "warc-type", "dtype": "string"}]}]}], "splits": [{"name": "train", "num_bytes": 77259995670.30853, "num_examples": 10888966}], "download_size": 42589347661, "dataset_size": 77259995670.30853}, {"config_name": "brwac", "features": [{"name": "id", "dtype": "int64"}, {"name": "text", "dtype": "string"}, {"name": "meta", "struct": [{"name": "dedup", "struct": [{"name": "exact_norm", "struct": [{"name": "cluster_main_idx", "dtype": "int64"}, {"name": "cluster_size", "dtype": "int64"}, {"name": "exact_hash_idx", "dtype": "int64"}, {"name": "is_duplicate", "dtype": "bool"}]}, {"name": "minhash", "struct": [{"name": "cluster_main_idx", "dtype": "int64"}, {"name": "cluster_size", "dtype": "int64"}, {"name": "is_duplicate", "dtype": "bool"}, {"name": "minhash_idx", "dtype": "int64"}]}]}, {"name": "doc_id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "uri", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 18218935459.169613, "num_examples": 3513588}], "download_size": 11210909325, "dataset_size": 18218935459.169613}, {"config_name": "cc100", "features": [{"name": "id", "dtype": "int64"}, {"name": "text", "dtype": "string"}, {"name": "meta", "struct": [{"name": "dedup", "struct": [{"name": "exact_norm", "struct": [{"name": "cluster_main_idx", "dtype": "int64"}, {"name": "cluster_size", "dtype": "int64"}, {"name": "exact_hash_idx", "dtype": "int64"}, {"name": "is_duplicate", "dtype": "bool"}]}, {"name": "minhash", "struct": [{"name": "cluster_main_idx", "dtype": "int64"}, {"name": "cluster_size", "dtype": "int64"}, {"name": "is_duplicate", "dtype": "bool"}, {"name": "minhash_idx", "dtype": "int64"}]}]}]}], "splits": [{"name": "train", "num_bytes": 53707749127.11777, "num_examples": 38059979}], "download_size": 34844109320, "dataset_size": 53707749127.11777}], "configs": [{"config_name": "OSCAR-2301", "data_files": [{"split": 
"train", "path": "OSCAR-2301/train-*"}]}, {"config_name": "brwac", "data_files": [{"split": "train", "path": "brwac/train-*"}]}, {"config_name": "cc100", "data_files": [{"split": "train", "path": "cc100/train-*"}]}]} | 2024-02-09T18:46:47+00:00 | [
"1911.02116",
"2201.06642",
"2107.06499"
] | [
"pt"
] | TAGS
#task_categories-text-generation #size_categories-10M<n<100M #language-Portuguese #license-cc-by-4.0 #arxiv-1911.02116 #arxiv-2201.06642 #arxiv-2107.06499 #region-us
| CrawlPT (deduplicated)
======================
CrawlPT is a generic Portuguese corpus extracted from various web pages.
Dataset Details
---------------
The dataset is composed of three corpora:
brWaC, C100-PT, and OSCAR-2301.
* brWaC: a web corpus for Brazilian Portuguese from 120,000 different websites.
* C100-PT: Portuguese subset from CC-100. C100 was created for training the multilingual Transformer XLM-R, containing two terabytes of cleaned data from 2018 snapshots of the Common Crawl project in 100 languages. We use the Portuguese subset, which contains 49.1 GiB of text.
* OSCAR-2301-PT: curation from OSCAR-2301 in the Portuguese language.
### Dataset Description
* Curated by:
* Funded by:
* Language(s) (NLP): Brazilian Portuguese (pt-BR)
* License: Creative Commons Attribution 4.0 International Public License
### Dataset Sources
* Repository: URL
* Paper:
Dataset Structure
-----------------
Data Collection and Processing
------------------------------
Raw corpora sizes in terms of billions of tokens and file size in GiB:
CrawlPT is deduplicated using the MinHash algorithm and Locality Sensitive Hashing, following the approach of Lee et al. (2022).
We used 5-grams and a signature of size 256, considering two documents to be identical if their Jaccard Similarity exceeded 0.7.
Duplication rates found by the MinHash-LSH algorithm for the CrawlPT corpus:
Acknowledgment
--------------
This work has been supported by the AI Center of Excellence (Centro de Excelência em Inteligência Artificial – CEIA) of the Institute of Informatics at the Federal University of Goiás (INF-UFG).
| [
"### Dataset Description\n* Curated by:\n* Funded by:\n* Language(s) (NLP): Brazilian Portuguese (pt-BR)\n* License: Creative Commons Attribution 4.0 International Public License",
"### Dataset Sources\n\n\n* Repository: URL\n* Paper:\n\n\nDataset Structure\n-----------------\n\n\nData Collection and Processing\n------------------------------\n\n\nRaw corpora sizes in terms of billions of tokens and file size in GiB:\n\n\n\nCrawlPT is deduplicated using MinHash algorithm and Locality Sensitive Hashing, following the approach of Lee et al. (2022).\n\n\nWe used 5-grams and a signature of size 256, considering two documents to be identical if their Jaccard Similarity exceeded 0.7.\nDeduplicate rate found by the Minhash-LSH algorithm for the CrawlPT corpus:\n\n\n\nAcknowledgment\n--------------\n\n\nThis work has been supported by the AI Center of Excellence (Centro de Excelência em Inteligência Artificial – CEIA) of the Institute of Informatics at the Federal University of Goiás (INF-UFG)."
] | [
"TAGS\n#task_categories-text-generation #size_categories-10M<n<100M #language-Portuguese #license-cc-by-4.0 #arxiv-1911.02116 #arxiv-2201.06642 #arxiv-2107.06499 #region-us \n",
"### Dataset Description\n* Curated by:\n* Funded by:\n* Language(s) (NLP): Brazilian Portuguese (pt-BR)\n* License: Creative Commons Attribution 4.0 International Public License",
"### Dataset Sources\n\n\n* Repository: URL\n* Paper:\n\n\nDataset Structure\n-----------------\n\n\nData Collection and Processing\n------------------------------\n\n\nRaw corpora sizes in terms of billions of tokens and file size in GiB:\n\n\n\nCrawlPT is deduplicated using MinHash algorithm and Locality Sensitive Hashing, following the approach of Lee et al. (2022).\n\n\nWe used 5-grams and a signature of size 256, considering two documents to be identical if their Jaccard Similarity exceeded 0.7.\nDeduplicate rate found by the Minhash-LSH algorithm for the CrawlPT corpus:\n\n\n\nAcknowledgment\n--------------\n\n\nThis work has been supported by the AI Center of Excellence (Centro de Excelência em Inteligência Artificial – CEIA) of the Institute of Informatics at the Federal University of Goiás (INF-UFG)."
] |
e6afecd8161806f35e531c5276ef28f83e4724ea |
# Dataset Card for Evaluation run of BFauber/opt125m_10e5_1ep
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BFauber/opt125m_10e5_1ep](https://huggingface.co/BFauber/opt125m_10e5_1ep) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BFauber__opt125m_10e5_1ep",
"harness_winogrande_5",
split="train")
```
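The aggregated scores are available through the "results" configuration as well; a minimal sketch, assuming the "latest" split naming used across these runs:

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_BFauber__opt125m_10e5_1ep",
	"results",
	split="latest")
```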
## Latest results
These are the [latest results from run 2024-02-02T19:20:39.939834](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt125m_10e5_1ep/blob/main/results_2024-02-02T19-20-39.939834.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26615727535166933,
"acc_stderr": 0.03094238339855658,
"acc_norm": 0.26747140124253277,
"acc_norm_stderr": 0.031765035154689494,
"mc1": 0.23133414932680538,
"mc1_stderr": 0.01476194517486267,
"mc2": 0.42531103718835517,
"mc2_stderr": 0.014922459227887219
},
"harness|arc:challenge|25": {
"acc": 0.20392491467576793,
"acc_stderr": 0.011774262478702252,
"acc_norm": 0.23464163822525597,
"acc_norm_stderr": 0.01238387356076866
},
"harness|hellaswag|10": {
"acc": 0.2877912766381199,
"acc_stderr": 0.004518080594528024,
"acc_norm": 0.3090021907986457,
"acc_norm_stderr": 0.004611377019520816
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.03633384414073461,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.03633384414073461
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3026315789473684,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.3026315789473684,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3018867924528302,
"acc_stderr": 0.028254200344438665,
"acc_norm": 0.3018867924528302,
"acc_norm_stderr": 0.028254200344438665
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.31213872832369943,
"acc_stderr": 0.035331333893236574,
"acc_norm": 0.31213872832369943,
"acc_norm_stderr": 0.035331333893236574
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105655,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.20851063829787234,
"acc_stderr": 0.026556982117838728,
"acc_norm": 0.20851063829787234,
"acc_norm_stderr": 0.026556982117838728
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.04537815354939392,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.04537815354939392
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.25517241379310346,
"acc_stderr": 0.03632984052707842,
"acc_norm": 0.25517241379310346,
"acc_norm_stderr": 0.03632984052707842
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25132275132275134,
"acc_stderr": 0.022340482339643898,
"acc_norm": 0.25132275132275134,
"acc_norm_stderr": 0.022340482339643898
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.0393253768039287,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.0393253768039287
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3161290322580645,
"acc_stderr": 0.02645087448904277,
"acc_norm": 0.3161290322580645,
"acc_norm_stderr": 0.02645087448904277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2660098522167488,
"acc_stderr": 0.03108982600293752,
"acc_norm": 0.2660098522167488,
"acc_norm_stderr": 0.03108982600293752
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2606060606060606,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.2606060606060606,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3484848484848485,
"acc_stderr": 0.033948539651564025,
"acc_norm": 0.3484848484848485,
"acc_norm_stderr": 0.033948539651564025
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.36787564766839376,
"acc_stderr": 0.03480175668466036,
"acc_norm": 0.36787564766839376,
"acc_norm_stderr": 0.03480175668466036
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3641025641025641,
"acc_stderr": 0.02439667298509477,
"acc_norm": 0.3641025641025641,
"acc_norm_stderr": 0.02439667298509477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073828,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073828
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3487394957983193,
"acc_stderr": 0.03095663632856655,
"acc_norm": 0.3487394957983193,
"acc_norm_stderr": 0.03095663632856655
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658754,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658754
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3394495412844037,
"acc_stderr": 0.02030210934266235,
"acc_norm": 0.3394495412844037,
"acc_norm_stderr": 0.02030210934266235
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.19831223628691982,
"acc_stderr": 0.025955020841621115,
"acc_norm": 0.19831223628691982,
"acc_norm_stderr": 0.025955020841621115
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.13004484304932734,
"acc_stderr": 0.022574519424174884,
"acc_norm": 0.13004484304932734,
"acc_norm_stderr": 0.022574519424174884
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.037683359597287414,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.037683359597287414
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.14049586776859505,
"acc_stderr": 0.03172233426002161,
"acc_norm": 0.14049586776859505,
"acc_norm_stderr": 0.03172233426002161
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2331288343558282,
"acc_stderr": 0.033220157957767414,
"acc_norm": 0.2331288343558282,
"acc_norm_stderr": 0.033220157957767414
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.16071428571428573,
"acc_stderr": 0.03485946096475741,
"acc_norm": 0.16071428571428573,
"acc_norm_stderr": 0.03485946096475741
},
"harness|hendrycksTest-management|5": {
"acc": 0.36893203883495146,
"acc_stderr": 0.047776151811567386,
"acc_norm": 0.36893203883495146,
"acc_norm_stderr": 0.047776151811567386
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19658119658119658,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.19658119658119658,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368466,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368466
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.20561941251596424,
"acc_stderr": 0.014452500456785825,
"acc_norm": 0.20561941251596424,
"acc_norm_stderr": 0.014452500456785825
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.022075709251757183,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.022075709251757183
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249588,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.014893391735249588
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.02609016250427905,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.02609016250427905
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24115755627009647,
"acc_stderr": 0.024296594034763426,
"acc_norm": 0.24115755627009647,
"acc_norm_stderr": 0.024296594034763426
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22530864197530864,
"acc_stderr": 0.023246202647819746,
"acc_norm": 0.22530864197530864,
"acc_norm_stderr": 0.023246202647819746
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25177304964539005,
"acc_stderr": 0.0258921511567094,
"acc_norm": 0.25177304964539005,
"acc_norm_stderr": 0.0258921511567094
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.242503259452412,
"acc_stderr": 0.010946570966348783,
"acc_norm": 0.242503259452412,
"acc_norm_stderr": 0.010946570966348783
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.01663931935031326,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.01663931935031326
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072774,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072774
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4,
"acc_stderr": 0.031362502409358936,
"acc_norm": 0.4,
"acc_norm_stderr": 0.031362502409358936
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.22885572139303484,
"acc_stderr": 0.029705284056772426,
"acc_norm": 0.22885572139303484,
"acc_norm_stderr": 0.029705284056772426
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-virology|5": {
"acc": 0.19879518072289157,
"acc_stderr": 0.03106939026078943,
"acc_norm": 0.19879518072289157,
"acc_norm_stderr": 0.03106939026078943
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.17543859649122806,
"acc_stderr": 0.029170885500727654,
"acc_norm": 0.17543859649122806,
"acc_norm_stderr": 0.029170885500727654
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23133414932680538,
"mc1_stderr": 0.01476194517486267,
"mc2": 0.42531103718835517,
"mc2_stderr": 0.014922459227887219
},
"harness|winogrande|5": {
"acc": 0.5067087608524072,
"acc_stderr": 0.014051220692330352
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_BFauber__opt125m_10e5_1ep | [
"region:us"
] | 2024-02-02T19:22:24+00:00 | {"pretty_name": "Evaluation run of BFauber/opt125m_10e5_1ep", "dataset_summary": "Dataset automatically created during the evaluation run of model [BFauber/opt125m_10e5_1ep](https://huggingface.co/BFauber/opt125m_10e5_1ep) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BFauber__opt125m_10e5_1ep\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T19:20:39.939834](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt125m_10e5_1ep/blob/main/results_2024-02-02T19-20-39.939834.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26615727535166933,\n \"acc_stderr\": 0.03094238339855658,\n \"acc_norm\": 0.26747140124253277,\n \"acc_norm_stderr\": 0.031765035154689494,\n \"mc1\": 0.23133414932680538,\n \"mc1_stderr\": 0.01476194517486267,\n \"mc2\": 0.42531103718835517,\n \"mc2_stderr\": 0.014922459227887219\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.20392491467576793,\n \"acc_stderr\": 0.011774262478702252,\n \"acc_norm\": 0.23464163822525597,\n \"acc_norm_stderr\": 0.01238387356076866\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2877912766381199,\n \"acc_stderr\": 0.004518080594528024,\n \"acc_norm\": 0.3090021907986457,\n \"acc_norm_stderr\": 0.004611377019520816\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n \"acc_stderr\": 0.03633384414073461,\n \"acc_norm\": 0.22962962962962963,\n \"acc_norm_stderr\": 0.03633384414073461\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3026315789473684,\n \"acc_stderr\": 0.037385206761196686,\n \"acc_norm\": 0.3026315789473684,\n \"acc_norm_stderr\": 0.037385206761196686\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.3018867924528302,\n \"acc_stderr\": 0.028254200344438665,\n \"acc_norm\": 0.3018867924528302,\n \"acc_norm_stderr\": 0.028254200344438665\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2361111111111111,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.2361111111111111,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.31213872832369943,\n \"acc_stderr\": 0.035331333893236574,\n \"acc_norm\": 0.31213872832369943,\n \"acc_norm_stderr\": 0.035331333893236574\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105655,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105655\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.20851063829787234,\n \"acc_stderr\": 0.026556982117838728,\n \"acc_norm\": 0.20851063829787234,\n \"acc_norm_stderr\": 0.026556982117838728\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n \"acc_stderr\": 0.04537815354939392,\n \"acc_norm\": 0.3684210526315789,\n \"acc_norm_stderr\": 0.04537815354939392\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.25517241379310346,\n \"acc_stderr\": 0.03632984052707842,\n \"acc_norm\": 0.25517241379310346,\n \"acc_norm_stderr\": 0.03632984052707842\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25132275132275134,\n \"acc_stderr\": 0.022340482339643898,\n \"acc_norm\": 0.25132275132275134,\n \"acc_norm_stderr\": 0.022340482339643898\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n \"acc_stderr\": 0.0393253768039287,\n \"acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.0393253768039287\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3161290322580645,\n \"acc_stderr\": 0.02645087448904277,\n \"acc_norm\": 0.3161290322580645,\n \"acc_norm_stderr\": 0.02645087448904277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2660098522167488,\n \"acc_stderr\": 0.03108982600293752,\n \"acc_norm\": 0.2660098522167488,\n \"acc_norm_stderr\": 0.03108982600293752\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2606060606060606,\n \"acc_stderr\": 0.034277431758165236,\n \"acc_norm\": 0.2606060606060606,\n \"acc_norm_stderr\": 0.034277431758165236\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.3484848484848485,\n \"acc_stderr\": 0.033948539651564025,\n \"acc_norm\": 0.3484848484848485,\n \"acc_norm_stderr\": 0.033948539651564025\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.36787564766839376,\n \"acc_stderr\": 0.03480175668466036,\n \"acc_norm\": 0.36787564766839376,\n \"acc_norm_stderr\": 0.03480175668466036\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.3641025641025641,\n \"acc_stderr\": 0.02439667298509477,\n \"acc_norm\": 0.3641025641025641,\n \"acc_norm_stderr\": 0.02439667298509477\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073828,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073828\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3487394957983193,\n \"acc_stderr\": 0.03095663632856655,\n \"acc_norm\": 0.3487394957983193,\n \"acc_norm_stderr\": 0.03095663632856655\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658754,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658754\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3394495412844037,\n \"acc_stderr\": 0.02030210934266235,\n \"acc_norm\": 0.3394495412844037,\n \"acc_norm_stderr\": 0.02030210934266235\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604246,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604246\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.19831223628691982,\n \"acc_stderr\": 0.025955020841621115,\n \"acc_norm\": 0.19831223628691982,\n \"acc_norm_stderr\": 0.025955020841621115\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.13004484304932734,\n \"acc_stderr\": 0.022574519424174884,\n \"acc_norm\": 0.13004484304932734,\n \"acc_norm_stderr\": 0.022574519424174884\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.037683359597287414,\n \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.037683359597287414\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.14049586776859505,\n \"acc_stderr\": 0.03172233426002161,\n \"acc_norm\": 0.14049586776859505,\n \"acc_norm_stderr\": 0.03172233426002161\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.21296296296296297,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.033220157957767414,\n \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.033220157957767414\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.16071428571428573,\n \"acc_stderr\": 0.03485946096475741,\n \"acc_norm\": 0.16071428571428573,\n \"acc_norm_stderr\": 0.03485946096475741\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.36893203883495146,\n \"acc_stderr\": 0.047776151811567386,\n \"acc_norm\": 0.36893203883495146,\n \"acc_norm_stderr\": 0.047776151811567386\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19658119658119658,\n \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.19658119658119658,\n \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368466,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368466\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.20561941251596424,\n \"acc_stderr\": 0.014452500456785825,\n \"acc_norm\": 0.20561941251596424,\n \"acc_norm_stderr\": 0.014452500456785825\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2138728323699422,\n \"acc_stderr\": 0.022075709251757183,\n \"acc_norm\": 0.2138728323699422,\n \"acc_norm_stderr\": 0.022075709251757183\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n \"acc_stderr\": 0.014893391735249588,\n \"acc_norm\": 0.27262569832402234,\n \"acc_norm_stderr\": 0.014893391735249588\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.02609016250427905,\n \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.02609016250427905\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24115755627009647,\n \"acc_stderr\": 0.024296594034763426,\n \"acc_norm\": 0.24115755627009647,\n \"acc_norm_stderr\": 0.024296594034763426\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.22530864197530864,\n \"acc_stderr\": 0.023246202647819746,\n \"acc_norm\": 0.22530864197530864,\n \"acc_norm_stderr\": 0.023246202647819746\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.25177304964539005,\n \"acc_stderr\": 0.0258921511567094,\n \"acc_norm\": 0.25177304964539005,\n \"acc_norm_stderr\": 0.0258921511567094\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.242503259452412,\n \"acc_stderr\": 0.010946570966348783,\n \"acc_norm\": 0.242503259452412,\n \"acc_norm_stderr\": 0.010946570966348783\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.01663931935031326,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.01663931935031326\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n \"acc_stderr\": 0.04013964554072774,\n \"acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.04013964554072774\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.031362502409358936,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.031362502409358936\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.22885572139303484,\n \"acc_stderr\": 0.029705284056772426,\n \"acc_norm\": 0.22885572139303484,\n \"acc_norm_stderr\": 0.029705284056772426\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.19879518072289157,\n \"acc_stderr\": 0.03106939026078943,\n \"acc_norm\": 0.19879518072289157,\n \"acc_norm_stderr\": 0.03106939026078943\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.17543859649122806,\n \"acc_stderr\": 0.029170885500727654,\n \"acc_norm\": 0.17543859649122806,\n \"acc_norm_stderr\": 0.029170885500727654\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23133414932680538,\n \"mc1_stderr\": 0.01476194517486267,\n \"mc2\": 0.42531103718835517,\n \"mc2_stderr\": 0.014922459227887219\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5067087608524072,\n \"acc_stderr\": 0.014051220692330352\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": 
"https://huggingface.co/BFauber/opt125m_10e5_1ep", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|arc:challenge|25_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|gsm8k|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hellaswag|10_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-20-39.939834.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-20-39.939834.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-20-39.939834.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T19-20-39.939834.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-20-39.939834.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T19_20_39.939834", "path": ["**/details_harness|winogrande|5_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T19-20-39.939834.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T19_20_39.939834", "path": ["results_2024-02-02T19-20-39.939834.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T19-20-39.939834.parquet"]}]}]} | 2024-02-02T19:22:48+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of BFauber/opt125m_10e5_1ep
Dataset automatically created during the evaluation run of model BFauber/opt125m_10e5_1ep on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
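```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BFauber__opt125m_10e5_1ep",
	"harness_winogrande_5",
	split="train")
```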
## Latest results
These are the latest results from run 2024-02-02T19:20:39.939834 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of BFauber/opt125m_10e5_1ep\n\n\n\nDataset automatically created during the evaluation run of model BFauber/opt125m_10e5_1ep on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T19:20:39.939834(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of BFauber/opt125m_10e5_1ep\n\n\n\nDataset automatically created during the evaluation run of model BFauber/opt125m_10e5_1ep on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T19:20:39.939834(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
872d395929373631ede63a58a7abb91c3f3450d6 |
# Dataset Card for Evaluation run of BFauber/opt1.3b_10e6
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BFauber/opt1.3b_10e6](https://huggingface.co/BFauber/opt1.3b_10e6) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BFauber__opt1.3b_10e6",
"harness_winogrande_5",
split="train")
```
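
If you want to check which per-task configurations exist before loading one, a minimal sketch (not part of the auto-generated card) is to enumerate them with `get_dataset_config_names` from the same `datasets` library:

```python
from datasets import get_dataset_config_names

# List the 63 per-task configurations plus the aggregated "results" config.
configs = get_dataset_config_names("open-llm-leaderboard/details_BFauber__opt1.3b_10e6")
print(len(configs))
print(configs[:5])
```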
## Latest results
These are the [latest results from run 2024-02-02T19:21:41.324363](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt1.3b_10e6/blob/main/results_2024-02-02T19-21-41.324363.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2600682096146715,
"acc_stderr": 0.03101730232799903,
"acc_norm": 0.26160699891348205,
"acc_norm_stderr": 0.03184452335795072,
"mc1": 0.2386780905752754,
"mc1_stderr": 0.014922629695456416,
"mc2": 0.4272454038428265,
"mc2_stderr": 0.015391374654641474
},
"harness|arc:challenge|25": {
"acc": 0.22525597269624573,
"acc_stderr": 0.012207839995407312,
"acc_norm": 0.257679180887372,
"acc_norm_stderr": 0.012780770562768414
},
"harness|hellaswag|10": {
"acc": 0.3370842461661024,
"acc_stderr": 0.00471747833568962,
"acc_norm": 0.41674965146385184,
"acc_norm_stderr": 0.004920130733271778
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.28289473684210525,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.28289473684210525,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2188679245283019,
"acc_stderr": 0.02544786382510861,
"acc_norm": 0.2188679245283019,
"acc_norm_stderr": 0.02544786382510861
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.03295304696818317,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.03295304696818317
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036844,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036844
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.20425531914893616,
"acc_stderr": 0.026355158413349424,
"acc_norm": 0.20425531914893616,
"acc_norm_stderr": 0.026355158413349424
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748141,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748141
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.296551724137931,
"acc_stderr": 0.03806142687309993,
"acc_norm": 0.296551724137931,
"acc_norm_stderr": 0.03806142687309993
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.022717467897708617,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.022717467897708617
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15079365079365079,
"acc_stderr": 0.03200686497287392,
"acc_norm": 0.15079365079365079,
"acc_norm_stderr": 0.03200686497287392
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24838709677419354,
"acc_stderr": 0.02458002892148101,
"acc_norm": 0.24838709677419354,
"acc_norm_stderr": 0.02458002892148101
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23030303030303031,
"acc_stderr": 0.03287666758603488,
"acc_norm": 0.23030303030303031,
"acc_norm_stderr": 0.03287666758603488
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2878787878787879,
"acc_stderr": 0.03225883512300992,
"acc_norm": 0.2878787878787879,
"acc_norm_stderr": 0.03225883512300992
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.35751295336787564,
"acc_stderr": 0.034588160421810045,
"acc_norm": 0.35751295336787564,
"acc_norm_stderr": 0.034588160421810045
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.21794871794871795,
"acc_stderr": 0.020932445774463203,
"acc_norm": 0.21794871794871795,
"acc_norm_stderr": 0.020932445774463203
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.25210084033613445,
"acc_stderr": 0.028205545033277726,
"acc_norm": 0.25210084033613445,
"acc_norm_stderr": 0.028205545033277726
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.03603038545360385,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.03603038545360385
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24036697247706423,
"acc_stderr": 0.01832060732096407,
"acc_norm": 0.24036697247706423,
"acc_norm_stderr": 0.01832060732096407
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3194444444444444,
"acc_stderr": 0.031798763421768524,
"acc_norm": 0.3194444444444444,
"acc_norm_stderr": 0.031798763421768524
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23039215686274508,
"acc_stderr": 0.029554292605695053,
"acc_norm": 0.23039215686274508,
"acc_norm_stderr": 0.029554292605695053
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2489451476793249,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.2489451476793249,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.20179372197309417,
"acc_stderr": 0.026936111912802273,
"acc_norm": 0.20179372197309417,
"acc_norm_stderr": 0.026936111912802273
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.03768335959728742,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.03768335959728742
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.371900826446281,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.371900826446281,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2883435582822086,
"acc_stderr": 0.035590395316173425,
"acc_norm": 0.2883435582822086,
"acc_norm_stderr": 0.035590395316173425
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.24107142857142858,
"acc_stderr": 0.04059867246952687,
"acc_norm": 0.24107142857142858,
"acc_norm_stderr": 0.04059867246952687
},
"harness|hendrycksTest-management|5": {
"acc": 0.2815533980582524,
"acc_stderr": 0.044532548363264673,
"acc_norm": 0.2815533980582524,
"acc_norm_stderr": 0.044532548363264673
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2094017094017094,
"acc_stderr": 0.026655699653922768,
"acc_norm": 0.2094017094017094,
"acc_norm_stderr": 0.026655699653922768
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2707535121328225,
"acc_stderr": 0.015889888362560486,
"acc_norm": 0.2707535121328225,
"acc_norm_stderr": 0.015889888362560486
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24566473988439305,
"acc_stderr": 0.02317629820399202,
"acc_norm": 0.24566473988439305,
"acc_norm_stderr": 0.02317629820399202
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249588,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.014893391735249588
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.024630048979824775,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.024630048979824775
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2829581993569132,
"acc_stderr": 0.02558306248998484,
"acc_norm": 0.2829581993569132,
"acc_norm_stderr": 0.02558306248998484
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2932098765432099,
"acc_stderr": 0.02532988817190092,
"acc_norm": 0.2932098765432099,
"acc_norm_stderr": 0.02532988817190092
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.26595744680851063,
"acc_stderr": 0.026358065698880592,
"acc_norm": 0.26595744680851063,
"acc_norm_stderr": 0.026358065698880592
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2522816166883963,
"acc_stderr": 0.011092789056875246,
"acc_norm": 0.2522816166883963,
"acc_norm_stderr": 0.011092789056875246
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.19852941176470587,
"acc_stderr": 0.024231013370541097,
"acc_norm": 0.19852941176470587,
"acc_norm_stderr": 0.024231013370541097
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.017479487001364764,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.017479487001364764
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2636363636363636,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.2636363636363636,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24081632653061225,
"acc_stderr": 0.027372942201788163,
"acc_norm": 0.24081632653061225,
"acc_norm_stderr": 0.027372942201788163
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409217,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409217
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-virology|5": {
"acc": 0.24096385542168675,
"acc_stderr": 0.03329394119073529,
"acc_norm": 0.24096385542168675,
"acc_norm_stderr": 0.03329394119073529
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.27485380116959063,
"acc_stderr": 0.034240429246915824,
"acc_norm": 0.27485380116959063,
"acc_norm_stderr": 0.034240429246915824
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2386780905752754,
"mc1_stderr": 0.014922629695456416,
"mc2": 0.4272454038428265,
"mc2_stderr": 0.015391374654641474
},
"harness|winogrande|5": {
"acc": 0.5414364640883977,
"acc_stderr": 0.014004146853791902
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
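
To work with these aggregated numbers programmatically rather than reading the JSON above, one option (a sketch, assuming the schema of the auto-generated parquet files) is to load the "results" configuration at its "latest" split:

```python
from datasets import load_dataset

# Load the aggregated results for the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_BFauber__opt1.3b_10e6",
                       "results",
                       split="latest")
# One row per run; the exact column layout depends on the harness version.
print(results[0])
```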
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_BFauber__opt1.3b_10e6 | [
"region:us"
] | 2024-02-02T19:23:25+00:00 | {"pretty_name": "Evaluation run of BFauber/opt1.3b_10e6", "dataset_summary": "Dataset automatically created during the evaluation run of model [BFauber/opt1.3b_10e6](https://huggingface.co/BFauber/opt1.3b_10e6) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BFauber__opt1.3b_10e6\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T19:21:41.324363](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt1.3b_10e6/blob/main/results_2024-02-02T19-21-41.324363.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2600682096146715,\n \"acc_stderr\": 0.03101730232799903,\n \"acc_norm\": 0.26160699891348205,\n \"acc_norm_stderr\": 0.03184452335795072,\n \"mc1\": 0.2386780905752754,\n \"mc1_stderr\": 0.014922629695456416,\n \"mc2\": 0.4272454038428265,\n \"mc2_stderr\": 0.015391374654641474\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.22525597269624573,\n \"acc_stderr\": 0.012207839995407312,\n \"acc_norm\": 0.257679180887372,\n \"acc_norm_stderr\": 0.012780770562768414\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3370842461661024,\n \"acc_stderr\": 0.00471747833568962,\n \"acc_norm\": 0.41674965146385184,\n \"acc_norm_stderr\": 0.004920130733271778\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.28289473684210525,\n \"acc_stderr\": 0.03665349695640767,\n \"acc_norm\": 0.28289473684210525,\n \"acc_norm_stderr\": 0.03665349695640767\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2188679245283019,\n \"acc_stderr\": 0.02544786382510861,\n \"acc_norm\": 0.2188679245283019,\n \"acc_norm_stderr\": 0.02544786382510861\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 
0.03861229196653694,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.03295304696818317,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.03295304696818317\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036844,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036844\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.20425531914893616,\n \"acc_stderr\": 0.026355158413349424,\n \"acc_norm\": 0.20425531914893616,\n \"acc_norm_stderr\": 0.026355158413349424\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.296551724137931,\n \"acc_stderr\": 0.03806142687309993,\n \"acc_norm\": 0.296551724137931,\n \"acc_norm_stderr\": 0.03806142687309993\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708617,\n \"acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708617\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15079365079365079,\n \"acc_stderr\": 0.03200686497287392,\n \"acc_norm\": 0.15079365079365079,\n \"acc_norm_stderr\": 0.03200686497287392\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24838709677419354,\n \"acc_stderr\": 0.02458002892148101,\n \"acc_norm\": 0.24838709677419354,\n \"acc_norm_stderr\": 0.02458002892148101\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.03287666758603488,\n \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.03287666758603488\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.2878787878787879,\n \"acc_stderr\": 0.03225883512300992,\n \"acc_norm\": 0.2878787878787879,\n \"acc_norm_stderr\": 0.03225883512300992\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.35751295336787564,\n \"acc_stderr\": 0.034588160421810045,\n \"acc_norm\": 0.35751295336787564,\n \"acc_norm_stderr\": 0.034588160421810045\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.21794871794871795,\n \"acc_stderr\": 0.020932445774463203,\n \"acc_norm\": 0.21794871794871795,\n \"acc_norm_stderr\": 0.020932445774463203\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.25210084033613445,\n \"acc_stderr\": 0.028205545033277726,\n \"acc_norm\": 0.25210084033613445,\n \"acc_norm_stderr\": 0.028205545033277726\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360385,\n \"acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360385\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.24036697247706423,\n \"acc_stderr\": 0.01832060732096407,\n \"acc_norm\": 0.24036697247706423,\n \"acc_norm_stderr\": 0.01832060732096407\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3194444444444444,\n \"acc_stderr\": 0.031798763421768524,\n \"acc_norm\": 0.3194444444444444,\n \"acc_norm_stderr\": 0.031798763421768524\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.23039215686274508,\n \"acc_stderr\": 0.029554292605695053,\n \"acc_norm\": 0.23039215686274508,\n \"acc_norm_stderr\": 0.029554292605695053\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2489451476793249,\n \"acc_stderr\": 0.028146970599422644,\n \"acc_norm\": 0.2489451476793249,\n \"acc_norm_stderr\": 0.028146970599422644\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.20179372197309417,\n \"acc_stderr\": 0.026936111912802273,\n \"acc_norm\": 0.20179372197309417,\n \"acc_norm_stderr\": 0.026936111912802273\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.03768335959728742,\n \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.03768335959728742\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.371900826446281,\n \"acc_stderr\": 0.044120158066245044,\n \"acc_norm\": 0.371900826446281,\n \"acc_norm_stderr\": 0.044120158066245044\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.21296296296296297,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2883435582822086,\n \"acc_stderr\": 0.035590395316173425,\n \"acc_norm\": 0.2883435582822086,\n \"acc_norm_stderr\": 0.035590395316173425\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n \"acc_stderr\": 0.04059867246952687,\n \"acc_norm\": 0.24107142857142858,\n \"acc_norm_stderr\": 0.04059867246952687\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2815533980582524,\n \"acc_stderr\": 0.044532548363264673,\n \"acc_norm\": 0.2815533980582524,\n \"acc_norm_stderr\": 0.044532548363264673\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2094017094017094,\n \"acc_stderr\": 0.026655699653922768,\n \"acc_norm\": 0.2094017094017094,\n \"acc_norm_stderr\": 0.026655699653922768\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2707535121328225,\n \"acc_stderr\": 0.015889888362560486,\n \"acc_norm\": 0.2707535121328225,\n \"acc_norm_stderr\": 0.015889888362560486\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24566473988439305,\n \"acc_stderr\": 0.02317629820399202,\n \"acc_norm\": 0.24566473988439305,\n \"acc_norm_stderr\": 0.02317629820399202\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n \"acc_stderr\": 0.014893391735249588,\n \"acc_norm\": 0.27262569832402234,\n \"acc_norm_stderr\": 0.014893391735249588\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.024630048979824775,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.024630048979824775\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2829581993569132,\n \"acc_stderr\": 0.02558306248998484,\n \"acc_norm\": 0.2829581993569132,\n \"acc_norm_stderr\": 0.02558306248998484\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2932098765432099,\n \"acc_stderr\": 0.02532988817190092,\n \"acc_norm\": 0.2932098765432099,\n \"acc_norm_stderr\": 0.02532988817190092\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.26595744680851063,\n \"acc_stderr\": 0.026358065698880592,\n \"acc_norm\": 0.26595744680851063,\n \"acc_norm_stderr\": 0.026358065698880592\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2522816166883963,\n \"acc_stderr\": 0.011092789056875246,\n \"acc_norm\": 0.2522816166883963,\n \"acc_norm_stderr\": 0.011092789056875246\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.19852941176470587,\n \"acc_stderr\": 0.024231013370541097,\n \"acc_norm\": 0.19852941176470587,\n \"acc_norm_stderr\": 0.024231013370541097\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.24836601307189543,\n \"acc_stderr\": 0.017479487001364764,\n \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.017479487001364764\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2636363636363636,\n \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.2636363636363636,\n \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.24081632653061225,\n \"acc_stderr\": 0.027372942201788163,\n \"acc_norm\": 0.24081632653061225,\n \"acc_norm_stderr\": 0.027372942201788163\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n \"acc_stderr\": 0.030147775935409217,\n \"acc_norm\": 0.23880597014925373,\n \"acc_norm_stderr\": 0.030147775935409217\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.24096385542168675,\n \"acc_stderr\": 0.03329394119073529,\n \"acc_norm\": 0.24096385542168675,\n \"acc_norm_stderr\": 0.03329394119073529\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.27485380116959063,\n \"acc_stderr\": 0.034240429246915824,\n \"acc_norm\": 0.27485380116959063,\n \"acc_norm_stderr\": 0.034240429246915824\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2386780905752754,\n \"mc1_stderr\": 0.014922629695456416,\n \"mc2\": 0.4272454038428265,\n \"mc2_stderr\": 0.015391374654641474\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5414364640883977,\n \"acc_stderr\": 0.014004146853791902\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/BFauber/opt1.3b_10e6", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|arc:challenge|25_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|gsm8k|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hellaswag|10_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-21-41.324363.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-21-41.324363.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-21-41.324363.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T19-21-41.324363.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-21-41.324363.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T19_21_41.324363", "path": ["**/details_harness|winogrande|5_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T19-21-41.324363.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T19_21_41.324363", "path": ["results_2024-02-02T19-21-41.324363.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T19-21-41.324363.parquet"]}]}]} | 2024-02-02T19:23:51+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of BFauber/opt1.3b_10e6
Dataset automatically created during the evaluation run of model BFauber/opt1.3b_10e6 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can, for instance, do the following:
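```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BFauber__opt1.3b_10e6",
	"harness_winogrande_5",
	split="train")
```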
## Latest results
These are the latest results from run 2024-02-02T19:21:41.324363 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
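```python
{
    "all": {
        "acc": 0.2600682096146715,
        "acc_stderr": 0.03101730232799903,
        "acc_norm": 0.26160699891348205,
        "acc_norm_stderr": 0.03184452335795072,
        "mc1": 0.2386780905752754,
        "mc1_stderr": 0.014922629695456416,
        "mc2": 0.4272454038428265,
        "mc2_stderr": 0.015391374654641474
    }
    # Per-task scores follow the same schema; the full breakdown is kept in
    # the run summary within this dataset's metadata.
}
```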
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
77bd465e95360f2b417a25ae587b0a6fcc04d8dc |
# Dataset Card for Evaluation run of BFauber/opt1.3b_10e5
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BFauber/opt1.3b_10e5](https://huggingface.co/BFauber/opt1.3b_10e5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BFauber__opt1.3b_10e5",
"harness_winogrande_5",
split="train")
```
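Each evaluated task is exposed as its own configuration (for example `harness_hellaswag_10`). As a minimal sketch using the standard `datasets` API, you can enumerate the available configurations before picking one:

```python
from datasets import get_dataset_config_names

# One configuration per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names("open-llm-leaderboard/details_BFauber__opt1.3b_10e5")
print(configs)
```

Per the split-naming convention described above, the `split` argument accepts "train", "latest", or a timestamped split named after a specific run.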
## Latest results
These are the [latest results from run 2024-02-02T19:22:53.886717](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt1.3b_10e5/blob/main/results_2024-02-02T19-22-53.886717.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.25956926248319,
"acc_stderr": 0.030774098043396373,
"acc_norm": 0.26133458633537765,
"acc_norm_stderr": 0.031590043549587034,
"mc1": 0.22643818849449204,
"mc1_stderr": 0.014651337324602578,
"mc2": 0.38176130958688614,
"mc2_stderr": 0.014298868021565533
},
"harness|arc:challenge|25": {
"acc": 0.2645051194539249,
"acc_stderr": 0.012889272949313368,
"acc_norm": 0.295221843003413,
"acc_norm_stderr": 0.013329750293382318
},
"harness|hellaswag|10": {
"acc": 0.4070902210714997,
"acc_stderr": 0.004902878806733043,
"acc_norm": 0.5280820553674567,
"acc_norm_stderr": 0.004981905293878152
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.03633384414073462,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.03633384414073462
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.25,
"acc_stderr": 0.03523807393012047,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03523807393012047
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3018867924528302,
"acc_stderr": 0.028254200344438662,
"acc_norm": 0.3018867924528302,
"acc_norm_stderr": 0.028254200344438662
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483098,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483098
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.043898699568087785,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.043898699568087785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036622,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036622
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.225531914893617,
"acc_stderr": 0.027321078417387536,
"acc_norm": 0.225531914893617,
"acc_norm_stderr": 0.027321078417387536
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436716,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436716
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.25517241379310346,
"acc_stderr": 0.03632984052707841,
"acc_norm": 0.25517241379310346,
"acc_norm_stderr": 0.03632984052707841
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24867724867724866,
"acc_stderr": 0.02226181769240017,
"acc_norm": 0.24867724867724866,
"acc_norm_stderr": 0.02226181769240017
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.038522733649243156,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.038522733649243156
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.26129032258064516,
"acc_stderr": 0.024993053397764815,
"acc_norm": 0.26129032258064516,
"acc_norm_stderr": 0.024993053397764815
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.03477691162163659,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.03477691162163659
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.030532892233932032,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.030532892233932032
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.30569948186528495,
"acc_stderr": 0.033248379397581594,
"acc_norm": 0.30569948186528495,
"acc_norm_stderr": 0.033248379397581594
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.35128205128205126,
"acc_stderr": 0.024203665177902796,
"acc_norm": 0.35128205128205126,
"acc_norm_stderr": 0.024203665177902796
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02730914058823018,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02730914058823018
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.31932773109243695,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.31932773109243695,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389024,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3155963302752294,
"acc_stderr": 0.019926117513869666,
"acc_norm": 0.3155963302752294,
"acc_norm_stderr": 0.019926117513869666
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3425925925925926,
"acc_stderr": 0.03236585252602157,
"acc_norm": 0.3425925925925926,
"acc_norm_stderr": 0.03236585252602157
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.03019028245350195,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.03019028245350195
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.028756799629658342,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.028756799629658342
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.13004484304932734,
"acc_stderr": 0.02257451942417487,
"acc_norm": 0.13004484304932734,
"acc_norm_stderr": 0.02257451942417487
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.1984732824427481,
"acc_stderr": 0.03498149385462471,
"acc_norm": 0.1984732824427481,
"acc_norm_stderr": 0.03498149385462471
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.256198347107438,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.256198347107438,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26380368098159507,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.26380368098159507,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.16071428571428573,
"acc_stderr": 0.034859460964757415,
"acc_norm": 0.16071428571428573,
"acc_norm_stderr": 0.034859460964757415
},
"harness|hendrycksTest-management|5": {
"acc": 0.3106796116504854,
"acc_stderr": 0.0458212416016155,
"acc_norm": 0.3106796116504854,
"acc_norm_stderr": 0.0458212416016155
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.24786324786324787,
"acc_stderr": 0.028286324075564397,
"acc_norm": 0.24786324786324787,
"acc_norm_stderr": 0.028286324075564397
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2554278416347382,
"acc_stderr": 0.015594955384455779,
"acc_norm": 0.2554278416347382,
"acc_norm_stderr": 0.015594955384455779
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2581699346405229,
"acc_stderr": 0.025058503316958143,
"acc_norm": 0.2581699346405229,
"acc_norm_stderr": 0.025058503316958143
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.26366559485530544,
"acc_stderr": 0.02502553850053234,
"acc_norm": 0.26366559485530544,
"acc_norm_stderr": 0.02502553850053234
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2006172839506173,
"acc_stderr": 0.022282313949774885,
"acc_norm": 0.2006172839506173,
"acc_norm_stderr": 0.022282313949774885
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.026011992930902016,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.026011992930902016
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2392438070404172,
"acc_stderr": 0.010896123652676651,
"acc_norm": 0.2392438070404172,
"acc_norm_stderr": 0.010896123652676651
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.44485294117647056,
"acc_stderr": 0.03018753206032938,
"acc_norm": 0.44485294117647056,
"acc_norm_stderr": 0.03018753206032938
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.01716058723504635,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.01716058723504635
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3181818181818182,
"acc_stderr": 0.04461272175910507,
"acc_norm": 0.3181818181818182,
"acc_norm_stderr": 0.04461272175910507
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17959183673469387,
"acc_stderr": 0.024573293589585637,
"acc_norm": 0.17959183673469387,
"acc_norm_stderr": 0.024573293589585637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.029929415408348398,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.029929415408348398
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.15060240963855423,
"acc_stderr": 0.02784386378726433,
"acc_norm": 0.15060240963855423,
"acc_norm_stderr": 0.02784386378726433
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.031267817146631786,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.031267817146631786
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22643818849449204,
"mc1_stderr": 0.014651337324602578,
"mc2": 0.38176130958688614,
"mc2_stderr": 0.014298868021565533
},
"harness|winogrande|5": {
"acc": 0.5666929755327546,
"acc_stderr": 0.013926915052757347
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_BFauber__opt1.3b_10e5 | [
"region:us"
] | 2024-02-02T19:24:36+00:00 | {"pretty_name": "Evaluation run of BFauber/opt1.3b_10e5", "dataset_summary": "Dataset automatically created during the evaluation run of model [BFauber/opt1.3b_10e5](https://huggingface.co/BFauber/opt1.3b_10e5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BFauber__opt1.3b_10e5\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T19:22:53.886717](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt1.3b_10e5/blob/main/results_2024-02-02T19-22-53.886717.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25956926248319,\n \"acc_stderr\": 0.030774098043396373,\n \"acc_norm\": 0.26133458633537765,\n \"acc_norm_stderr\": 0.031590043549587034,\n \"mc1\": 0.22643818849449204,\n \"mc1_stderr\": 0.014651337324602578,\n \"mc2\": 0.38176130958688614,\n \"mc2_stderr\": 0.014298868021565533\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.2645051194539249,\n \"acc_stderr\": 0.012889272949313368,\n \"acc_norm\": 0.295221843003413,\n \"acc_norm_stderr\": 0.013329750293382318\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4070902210714997,\n \"acc_stderr\": 0.004902878806733043,\n \"acc_norm\": 0.5280820553674567,\n \"acc_norm_stderr\": 0.004981905293878152\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n \"acc_stderr\": 0.03633384414073462,\n \"acc_norm\": 0.22962962962962963,\n \"acc_norm_stderr\": 0.03633384414073462\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03523807393012047,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03523807393012047\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.3018867924528302,\n \"acc_stderr\": 0.028254200344438662,\n \"acc_norm\": 0.3018867924528302,\n \"acc_norm_stderr\": 0.028254200344438662\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2361111111111111,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.2361111111111111,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536955,\n 
\"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n \"acc_stderr\": 0.03242414757483098,\n \"acc_norm\": 0.23699421965317918,\n \"acc_norm_stderr\": 0.03242414757483098\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.043898699568087785,\n \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.043898699568087785\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036622,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036622\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.225531914893617,\n \"acc_stderr\": 0.027321078417387536,\n \"acc_norm\": 0.225531914893617,\n \"acc_norm_stderr\": 0.027321078417387536\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.25517241379310346,\n \"acc_stderr\": 0.03632984052707841,\n \"acc_norm\": 0.25517241379310346,\n \"acc_norm_stderr\": 0.03632984052707841\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.24867724867724866,\n \"acc_stderr\": 0.02226181769240017,\n \"acc_norm\": 0.24867724867724866,\n \"acc_norm_stderr\": 0.02226181769240017\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n \"acc_stderr\": 0.038522733649243156,\n \"acc_norm\": 0.24603174603174602,\n \"acc_norm_stderr\": 0.038522733649243156\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.26129032258064516,\n \"acc_stderr\": 0.024993053397764815,\n \"acc_norm\": 0.26129032258064516,\n \"acc_norm_stderr\": 0.024993053397764815\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.03477691162163659,\n \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.03477691162163659\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.030532892233932032,\n \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.030532892233932032\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.30569948186528495,\n \"acc_stderr\": 0.033248379397581594,\n \"acc_norm\": 0.30569948186528495,\n \"acc_norm_stderr\": 0.033248379397581594\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.35128205128205126,\n \"acc_stderr\": 0.024203665177902796,\n \"acc_norm\": 0.35128205128205126,\n \"acc_norm_stderr\": 0.024203665177902796\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.02730914058823018,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02730914058823018\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.31932773109243695,\n \"acc_stderr\": 0.030283995525884396,\n \"acc_norm\": 0.31932773109243695,\n \"acc_norm_stderr\": 0.030283995525884396\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389024,\n \"acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3155963302752294,\n \"acc_stderr\": 0.019926117513869666,\n \"acc_norm\": 0.3155963302752294,\n \"acc_norm_stderr\": 0.019926117513869666\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3425925925925926,\n \"acc_stderr\": 0.03236585252602157,\n \"acc_norm\": 0.3425925925925926,\n \"acc_norm_stderr\": 0.03236585252602157\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.03019028245350195,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.03019028245350195\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.26582278481012656,\n \"acc_stderr\": 0.028756799629658342,\n \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.028756799629658342\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.13004484304932734,\n \"acc_stderr\": 0.02257451942417487,\n \"acc_norm\": 0.13004484304932734,\n \"acc_norm_stderr\": 0.02257451942417487\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.1984732824427481,\n \"acc_stderr\": 0.03498149385462471,\n \"acc_norm\": 0.1984732824427481,\n \"acc_norm_stderr\": 0.03498149385462471\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.256198347107438,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\": 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.26380368098159507,\n \"acc_stderr\": 0.03462419931615623,\n \"acc_norm\": 0.26380368098159507,\n \"acc_norm_stderr\": 0.03462419931615623\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.16071428571428573,\n \"acc_stderr\": 0.034859460964757415,\n \"acc_norm\": 0.16071428571428573,\n \"acc_norm_stderr\": 0.034859460964757415\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.3106796116504854,\n \"acc_stderr\": 0.0458212416016155,\n \"acc_norm\": 0.3106796116504854,\n \"acc_norm_stderr\": 0.0458212416016155\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.24786324786324787,\n \"acc_stderr\": 0.028286324075564397,\n \"acc_norm\": 0.24786324786324787,\n \"acc_norm_stderr\": 0.028286324075564397\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2554278416347382,\n \"acc_stderr\": 
0.015594955384455779,\n \"acc_norm\": 0.2554278416347382,\n \"acc_norm_stderr\": 0.015594955384455779\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2581699346405229,\n \"acc_stderr\": 0.025058503316958143,\n \"acc_norm\": 0.2581699346405229,\n \"acc_norm_stderr\": 0.025058503316958143\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.26366559485530544,\n \"acc_stderr\": 0.02502553850053234,\n \"acc_norm\": 0.26366559485530544,\n \"acc_norm_stderr\": 0.02502553850053234\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2006172839506173,\n \"acc_stderr\": 0.022282313949774885,\n \"acc_norm\": 0.2006172839506173,\n \"acc_norm_stderr\": 0.022282313949774885\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2553191489361702,\n \"acc_stderr\": 0.026011992930902016,\n \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.026011992930902016\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2392438070404172,\n \"acc_stderr\": 0.010896123652676651,\n \"acc_norm\": 0.2392438070404172,\n \"acc_norm_stderr\": 0.010896123652676651\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.03018753206032938,\n \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.03018753206032938\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.01716058723504635,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.01716058723504635\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3181818181818182,\n \"acc_stderr\": 0.04461272175910507,\n \"acc_norm\": 0.3181818181818182,\n \"acc_norm_stderr\": 0.04461272175910507\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.17959183673469387,\n \"acc_stderr\": 0.024573293589585637,\n \"acc_norm\": 0.17959183673469387,\n \"acc_norm_stderr\": 0.024573293589585637\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n \"acc_stderr\": 0.029929415408348398,\n \"acc_norm\": 0.23383084577114427,\n \"acc_norm_stderr\": 0.029929415408348398\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.15060240963855423,\n \"acc_stderr\": 0.02784386378726433,\n \"acc_norm\": 0.15060240963855423,\n \"acc_norm_stderr\": 0.02784386378726433\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.031267817146631786,\n \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.031267817146631786\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22643818849449204,\n \"mc1_stderr\": 0.014651337324602578,\n \"mc2\": 0.38176130958688614,\n \"mc2_stderr\": 0.014298868021565533\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5666929755327546,\n \"acc_stderr\": 0.013926915052757347\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": 
"https://huggingface.co/BFauber/opt1.3b_10e5", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|arc:challenge|25_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|gsm8k|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hellaswag|10_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-22-53.886717.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-22-53.886717.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-22-53.886717.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T19-22-53.886717.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-22-53.886717.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T19_22_53.886717", "path": ["**/details_harness|winogrande|5_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T19-22-53.886717.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T19_22_53.886717", "path": ["results_2024-02-02T19-22-53.886717.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T19-22-53.886717.parquet"]}]}]} | 2024-02-02T19:25:03+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of BFauber/opt1.3b_10e5
Dataset automatically created during the evaluation run of model BFauber/opt1.3b_10e5 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
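These aggregated metrics can be loaded as their own configuration; a minimal sketch, with the "results" config name and the "latest" split taken from this card's metadata:

```python
from datasets import load_dataset

# aggregated run-level metrics; the "latest" split tracks the most recent eval
results = load_dataset("open-llm-leaderboard/details_BFauber__opt1.3b_10e5",
	"results",
	split="latest")
```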
To load the details from a run, you can for instance do the following:
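```python
from datasets import load_dataset

# per-sample details for the 5-shot Winogrande task; "train" points to the latest run
data = load_dataset("open-llm-leaderboard/details_BFauber__opt1.3b_10e5",
	"harness_winogrande_5",
	split="train")
```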
## Latest results
These are the latest results from run 2024-02-02T19:22:53.886717 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of BFauber/opt1.3b_10e5\n\n\n\nDataset automatically created during the evaluation run of model BFauber/opt1.3b_10e5 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T19:22:53.886717(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of BFauber/opt1.3b_10e5\n\n\n\nDataset automatically created during the evaluation run of model BFauber/opt1.3b_10e5 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T19:22:53.886717(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
68215c2d070462b5404115b1dd8799174174ce1e |
# Dataset Card for Evaluation run of BFauber/opt125m_10e5_10ep
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BFauber/opt125m_10e5_10ep](https://huggingface.co/BFauber/opt125m_10e5_10ep) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BFauber__opt125m_10e5_10ep",
"harness_winogrande_5",
split="train")
```
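To enumerate the available configurations programmatically (one per task, following the `harness_<task>_<n_shots>` naming used throughout this card, plus the aggregated "results" config), a minimal sketch using the standard `datasets` API:
```python
from datasets import get_dataset_config_names

# Lists one config per evaluated task, plus the aggregated "results" config
configs = get_dataset_config_names("open-llm-leaderboard/details_BFauber__opt125m_10e5_10ep")
print(configs)
```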
## Latest results
These are the [latest results from run 2024-02-02T19:26:25.551220](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt125m_10e5_10ep/blob/main/results_2024-02-02T19-26-25.551220.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24846725146327228,
"acc_stderr": 0.030376252466474542,
"acc_norm": 0.24882164259207676,
"acc_norm_stderr": 0.031176472316746258,
"mc1": 0.2533659730722154,
"mc1_stderr": 0.015225899340826837,
"mc2": 0.4621982015682342,
"mc2_stderr": 0.01527762977208882
},
"harness|arc:challenge|25": {
"acc": 0.21928327645051193,
"acc_stderr": 0.01209124578761574,
"acc_norm": 0.23976109215017063,
"acc_norm_stderr": 0.012476304127453958
},
"harness|hellaswag|10": {
"acc": 0.2863971320454093,
"acc_stderr": 0.004511533039406228,
"acc_norm": 0.3123879705238,
"acc_norm_stderr": 0.004625198756710251
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.03785714465066652,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.03785714465066652
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19736842105263158,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.19736842105263158,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21132075471698114,
"acc_stderr": 0.025125766484827845,
"acc_norm": 0.21132075471698114,
"acc_norm_stderr": 0.025125766484827845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483098,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483098
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179963,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179963
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102967,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102967
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2689655172413793,
"acc_stderr": 0.036951833116502325,
"acc_norm": 0.2689655172413793,
"acc_norm_stderr": 0.036951833116502325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.12698412698412698,
"acc_stderr": 0.029780417522688424,
"acc_norm": 0.12698412698412698,
"acc_norm_stderr": 0.029780417522688424
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3096774193548387,
"acc_stderr": 0.026302774983517418,
"acc_norm": 0.3096774193548387,
"acc_norm_stderr": 0.026302774983517418
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15763546798029557,
"acc_stderr": 0.025639014131172404,
"acc_norm": 0.15763546798029557,
"acc_norm_stderr": 0.025639014131172404
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.20707070707070707,
"acc_stderr": 0.028869778460267052,
"acc_norm": 0.20707070707070707,
"acc_norm_stderr": 0.028869778460267052
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.24870466321243523,
"acc_stderr": 0.031195840877700293,
"acc_norm": 0.24870466321243523,
"acc_norm_stderr": 0.031195840877700293
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2358974358974359,
"acc_stderr": 0.021525965407408726,
"acc_norm": 0.2358974358974359,
"acc_norm_stderr": 0.021525965407408726
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945277,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945277
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.02665353159671549,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.02665353159671549
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2052980132450331,
"acc_stderr": 0.03297986648473835,
"acc_norm": 0.2052980132450331,
"acc_norm_stderr": 0.03297986648473835
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24587155963302754,
"acc_stderr": 0.01846194096870845,
"acc_norm": 0.24587155963302754,
"acc_norm_stderr": 0.01846194096870845
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.029771775228145628,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.029771775228145628
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.36771300448430494,
"acc_stderr": 0.03236198350928275,
"acc_norm": 0.36771300448430494,
"acc_norm_stderr": 0.03236198350928275
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2331288343558282,
"acc_stderr": 0.033220157957767414,
"acc_norm": 0.2331288343558282,
"acc_norm_stderr": 0.033220157957767414
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.04327040932578729,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.04327040932578729
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.18803418803418803,
"acc_stderr": 0.02559819368665226,
"acc_norm": 0.18803418803418803,
"acc_norm_stderr": 0.02559819368665226
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26053639846743293,
"acc_stderr": 0.015696008563807092,
"acc_norm": 0.26053639846743293,
"acc_norm_stderr": 0.015696008563807092
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.26011560693641617,
"acc_stderr": 0.023618678310069363,
"acc_norm": 0.26011560693641617,
"acc_norm_stderr": 0.023618678310069363
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.023805186524888156,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.023805186524888156
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.19935691318327975,
"acc_stderr": 0.022691033780549656,
"acc_norm": 0.19935691318327975,
"acc_norm_stderr": 0.022691033780549656
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22530864197530864,
"acc_stderr": 0.02324620264781975,
"acc_norm": 0.22530864197530864,
"acc_norm_stderr": 0.02324620264781975
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.26595744680851063,
"acc_stderr": 0.026358065698880596,
"acc_norm": 0.26595744680851063,
"acc_norm_stderr": 0.026358065698880596
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2392438070404172,
"acc_stderr": 0.010896123652676651,
"acc_norm": 0.2392438070404172,
"acc_norm_stderr": 0.010896123652676651
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.017630827375148383,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.017630827375148383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2636363636363636,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.2636363636363636,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.21224489795918366,
"acc_stderr": 0.026176967197866767,
"acc_norm": 0.21224489795918366,
"acc_norm_stderr": 0.026176967197866767
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409224,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409224
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-virology|5": {
"acc": 0.27710843373493976,
"acc_stderr": 0.034843315926805875,
"acc_norm": 0.27710843373493976,
"acc_norm_stderr": 0.034843315926805875
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.25146198830409355,
"acc_stderr": 0.033275044238468436,
"acc_norm": 0.25146198830409355,
"acc_norm_stderr": 0.033275044238468436
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2533659730722154,
"mc1_stderr": 0.015225899340826837,
"mc2": 0.4621982015682342,
"mc2_stderr": 0.01527762977208882
},
"harness|winogrande|5": {
"acc": 0.5224940805051302,
"acc_stderr": 0.014038257824059883
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
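The aggregated metrics above can also be read back from the "results" configuration rather than parsing this JSON by hand; a minimal sketch, assuming the usual `datasets` API (the exact column layout of the results split is not documented here, so print a record to confirm the field names):
```python
from datasets import load_dataset

# The "results" config stores the aggregated run metrics;
# its "latest" split mirrors the JSON shown above.
results = load_dataset("open-llm-leaderboard/details_BFauber__opt125m_10e5_10ep",
                       "results",
                       split="latest")
print(results[0])
```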
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_BFauber__opt125m_10e5_10ep | [
"region:us"
] | 2024-02-02T19:28:09+00:00 | {"pretty_name": "Evaluation run of BFauber/opt125m_10e5_10ep", "dataset_summary": "Dataset automatically created during the evaluation run of model [BFauber/opt125m_10e5_10ep](https://huggingface.co/BFauber/opt125m_10e5_10ep) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BFauber__opt125m_10e5_10ep\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T19:26:25.551220](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt125m_10e5_10ep/blob/main/results_2024-02-02T19-26-25.551220.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24846725146327228,\n \"acc_stderr\": 0.030376252466474542,\n \"acc_norm\": 0.24882164259207676,\n \"acc_norm_stderr\": 0.031176472316746258,\n \"mc1\": 0.2533659730722154,\n \"mc1_stderr\": 0.015225899340826837,\n \"mc2\": 0.4621982015682342,\n \"mc2_stderr\": 0.01527762977208882\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.21928327645051193,\n \"acc_stderr\": 0.01209124578761574,\n \"acc_norm\": 0.23976109215017063,\n \"acc_norm_stderr\": 0.012476304127453958\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2863971320454093,\n \"acc_stderr\": 0.004511533039406228,\n \"acc_norm\": 0.3123879705238,\n \"acc_norm_stderr\": 0.004625198756710251\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.03785714465066652,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.03785714465066652\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.19736842105263158,\n \"acc_stderr\": 0.03238981601699397,\n \"acc_norm\": 0.19736842105263158,\n \"acc_norm_stderr\": 0.03238981601699397\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21132075471698114,\n \"acc_stderr\": 0.025125766484827845,\n \"acc_norm\": 0.21132075471698114,\n \"acc_norm_stderr\": 0.025125766484827845\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n \"acc_stderr\": 0.03242414757483098,\n \"acc_norm\": 0.23699421965317918,\n \"acc_norm_stderr\": 0.03242414757483098\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179963,\n \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179963\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102967,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102967\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2689655172413793,\n \"acc_stderr\": 0.036951833116502325,\n \"acc_norm\": 0.2689655172413793,\n \"acc_norm_stderr\": 0.036951833116502325\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.12698412698412698,\n \"acc_stderr\": 0.029780417522688424,\n \"acc_norm\": 0.12698412698412698,\n \"acc_norm_stderr\": 0.029780417522688424\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3096774193548387,\n \"acc_stderr\": 0.026302774983517418,\n \"acc_norm\": 0.3096774193548387,\n \"acc_norm_stderr\": 0.026302774983517418\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.15763546798029557,\n \"acc_stderr\": 0.025639014131172404,\n \"acc_norm\": 0.15763546798029557,\n \"acc_norm_stderr\": 0.025639014131172404\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.20707070707070707,\n \"acc_stderr\": 0.028869778460267052,\n \"acc_norm\": 0.20707070707070707,\n \"acc_norm_stderr\": 0.028869778460267052\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.24870466321243523,\n \"acc_stderr\": 0.031195840877700293,\n \"acc_norm\": 0.24870466321243523,\n \"acc_norm_stderr\": 0.031195840877700293\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2358974358974359,\n \"acc_stderr\": 0.021525965407408726,\n \"acc_norm\": 0.2358974358974359,\n \"acc_norm_stderr\": 0.021525965407408726\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945277,\n \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945277\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.02665353159671549,\n \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.02665353159671549\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2052980132450331,\n \"acc_stderr\": 0.03297986648473835,\n \"acc_norm\": 0.2052980132450331,\n \"acc_norm_stderr\": 0.03297986648473835\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.24587155963302754,\n \"acc_stderr\": 0.01846194096870845,\n \"acc_norm\": 0.24587155963302754,\n \"acc_norm_stderr\": 0.01846194096870845\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321617,\n \"acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321617\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.029771775228145628,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.029771775228145628\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.36771300448430494,\n \"acc_stderr\": 0.03236198350928275,\n \"acc_norm\": 0.36771300448430494,\n \"acc_norm_stderr\": 0.03236198350928275\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306086,\n \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306086\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.24793388429752067,\n \"acc_stderr\": 0.03941897526516303,\n \"acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.03941897526516303\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.033220157957767414,\n \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.033220157957767414\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n \"acc_stderr\": 0.04327040932578729,\n \"acc_norm\": 0.29464285714285715,\n \"acc_norm_stderr\": 0.04327040932578729\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.18803418803418803,\n \"acc_stderr\": 0.02559819368665226,\n \"acc_norm\": 0.18803418803418803,\n \"acc_norm_stderr\": 0.02559819368665226\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": 
{\n \"acc\": 0.26053639846743293,\n \"acc_stderr\": 0.015696008563807092,\n \"acc_norm\": 0.26053639846743293,\n \"acc_norm_stderr\": 0.015696008563807092\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.26011560693641617,\n \"acc_stderr\": 0.023618678310069363,\n \"acc_norm\": 0.26011560693641617,\n \"acc_norm_stderr\": 0.023618678310069363\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.023805186524888156,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.023805186524888156\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.19935691318327975,\n \"acc_stderr\": 0.022691033780549656,\n \"acc_norm\": 0.19935691318327975,\n \"acc_norm_stderr\": 0.022691033780549656\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.22530864197530864,\n \"acc_stderr\": 0.02324620264781975,\n \"acc_norm\": 0.22530864197530864,\n \"acc_norm_stderr\": 0.02324620264781975\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.26595744680851063,\n \"acc_stderr\": 0.026358065698880596,\n \"acc_norm\": 0.26595744680851063,\n \"acc_norm_stderr\": 0.026358065698880596\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2392438070404172,\n \"acc_stderr\": 0.010896123652676651,\n \"acc_norm\": 0.2392438070404172,\n \"acc_norm_stderr\": 0.010896123652676651\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.017630827375148383,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.017630827375148383\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2636363636363636,\n \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.2636363636363636,\n \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.21224489795918366,\n \"acc_stderr\": 0.026176967197866767,\n \"acc_norm\": 0.21224489795918366,\n \"acc_norm_stderr\": 0.026176967197866767\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n \"acc_stderr\": 0.030147775935409224,\n \"acc_norm\": 0.23880597014925373,\n \"acc_norm_stderr\": 0.030147775935409224\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.27710843373493976,\n \"acc_stderr\": 0.034843315926805875,\n \"acc_norm\": 0.27710843373493976,\n \"acc_norm_stderr\": 0.034843315926805875\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.25146198830409355,\n \"acc_stderr\": 0.033275044238468436,\n \"acc_norm\": 0.25146198830409355,\n \"acc_norm_stderr\": 0.033275044238468436\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2533659730722154,\n \"mc1_stderr\": 0.015225899340826837,\n \"mc2\": 0.4621982015682342,\n \"mc2_stderr\": 0.01527762977208882\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5224940805051302,\n \"acc_stderr\": 0.014038257824059883\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n 
\"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/BFauber/opt125m_10e5_10ep", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|arc:challenge|25_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|gsm8k|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hellaswag|10_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-26-25.551220.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-26-25.551220.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-26-25.551220.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T19-26-25.551220.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-26-25.551220.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T19_26_25.551220", "path": ["**/details_harness|winogrande|5_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T19-26-25.551220.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T19_26_25.551220", "path": ["results_2024-02-02T19-26-25.551220.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T19-26-25.551220.parquet"]}]}]} | 2024-02-02T19:28:33+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of BFauber/opt125m_10e5_10ep
Dataset automatically created during the evaluation run of model BFauber/opt125m_10e5_10ep on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
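
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BFauber__opt125m_10e5_10ep",
	"harness_winogrande_5",
	split="train")
```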
## Latest results
These are the latest results from run 2024-02-02T19:26:25.551220 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of BFauber/opt125m_10e5_10ep\n\n\n\nDataset automatically created during the evaluation run of model BFauber/opt125m_10e5_10ep on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T19:26:25.551220(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of BFauber/opt125m_10e5_10ep\n\n\n\nDataset automatically created during the evaluation run of model BFauber/opt125m_10e5_10ep on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T19:26:25.551220(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
75d6a9c56537efa5353d87e1d457a22be1ae85ee | There are prompts, but no dataset in here. It's tough to generate one when tokens are produced at 3.39 tokens/s and there are 1,000 of them per prompt.
### 🧾 Minimum requirements
This prompt works with unmoderated models of 7B parameters or more. You can try it on weaker models, but there's no guarantee.
<details><summary>❤️🔥🎆 Intentions</summary>
### ❤️🔥🎆 Intentions
I want to make a steamy, explicit scene for erotica (curiosity killed the cat) to test my prompt skills, or even turn it into a dataset. The output can be improved with synonym substitution, paraphrasing tools, and a more distant writing style. There will be no conflict in the scene; it will be highly mutual, vulnerable, and lighthearted. The challenge is that these lines are not clearly drawn. It can be enjoyed by any gender, with gender neutrality kept in mind: just replace the pronouns and it reads the same.
</details>
<details><summary>⚙️🧹🔍 Tools</summary>
### ⚙️🧹🔍 Tools
*See code in `The 🐍 code` part.*
***Don't give the models the whole context. Feed it one line or one sentence at a time instead (a minimal driver sketch follows the settings below).***
<details><summary>Paraphrase (Unstable)</summary>
**Paraphrase (Unstable):** [h2o-danube-1.8b-chat-Q5_K_M.gguf](https://huggingface.co/h2oai/h2o-danube-1.8b-chat) (Quantized with llama.cpp by me; I don't provide the quantized file.)
- Settings
```py
prompt_context = (
"<|prompt|>\n"
"Paraphrase in English only, with no talk:\n"
"She read old texts from her phone, then he came in.\n"
"</s><|answer|>\n"
)
params = {
'prompt': prompt_context,
'n_predict': 2048, # Max tokens
'temp': 0.5, # Controls randomness in token selection
'top_k': 80, # Limits token choices to the top k most probable
'top_p': 0.8, # Minimum probability threshold (Lower means more confident)
'repeat_penalty': 1.3, # Discourages word repetition
'repeat_last_n': 32, # Determines repetition penalty scope
'n_batch': 32, # Controls simultaneous prompt batch processing
'streaming': True, # Allows real-time text generation
# 'callback': my_func # Handles streaming responses
}
```
</details>
<details><summary>Grammar corrector</summary>
**Grammar corrector:** [mzbac-falcon-7b-instruct-grammar-Q5_K_M.gguf](https://huggingface.co/maddes8cht/mzbac-falcon-7b-instruct-grammar-gguf)
- Settings
```py
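# The user sentence below is intentionally ungrammatical; it is the test input for the corrector.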
prompt_context = (
f"### User: Whileever she is go thogh sum document on her device, a man entering into the room later.\n"
"### You:"
)
params = {
'prompt': prompt_context,
'n_predict': 2048, # Max tokens
'temp': 0.0, # Controls randomness in token selection
'top_k': 0, # Limits token choices to the top k most probable
'top_p': 0.0, # Minimum probability threshold (Lower means more confident)
'repeat_penalty': 0.0, # Discourages word repetition
'repeat_last_n': 0, # Determines repetition penalty scope
'n_batch': 8, # Controls simultaneous prompt batch processing
'streaming': True, # Allows real-time text generation
# 'callback': my_func # Handles streaming responses
}
```
</details>
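
A minimal line-by-line driver, as a sketch only: it reuses the GPT4All setup from `The 🐍 code` section, and `draft.txt` is a hypothetical input file with one sentence per line.

```py
# Sketch: feed the paraphrase model one line at a time ("draft.txt" is hypothetical).
from gpt4all import GPT4All

model = GPT4All("h2o-danube-1.8b-chat-Q5_K_M.gguf", n_threads=6, device="cpu", allow_download=False)

with open("draft.txt") as f:
    for line in f:
        line = line.strip()
        if not line:
            continue  # skip blank lines
        prompt = (
            "<|prompt|>\n"
            "Paraphrase in English only, with no talk:\n"
            f"{line}\n"
            "</s><|answer|>\n"
        )
        # Same sampling settings as the 'Paraphrase (Unstable)' block above.
        print(model.generate(prompt, temp=0.5, top_k=80, top_p=0.8,
                             repeat_penalty=1.3, repeat_last_n=32, n_batch=32))
```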
</details>
<details><summary>🔍 Examination (WIP)</summary>
### 🔍 Examination (WIP)
You must make sure ***replies.jsonl*** exists for the viewer to work.

```py
import tkinter as tk
import jsonlines
clamp = lambda n, min_n, max_n: max(min(n, max_n), min_n)
class TextViewerVar:
def __init__(self, root):
self.root = root
self.text = "Lorem ipsum dolor sit amet, consectetur adipiscing elit."
self._default_font = {"name": "Consolas", "size": 12}
self.text_widget = tk.Text(root, wrap="char", bg='black', fg='white', insertbackground='white')
class TextViewer(TextViewerVar):
def __init__(self, root, window_size="854x480"):
super().__init__(root)
self.root.geometry(window_size)
self.refresh_display()
self.text_widget.bind("<Control-MouseWheel>", self._on_zoom)
self.text_widget.pack(side=tk.TOP, fill=tk.BOTH, expand=True)
def _on_zoom(self, event):
mouse_scroll_up = event.delta > 0
self._default_font["size"] += 1 if mouse_scroll_up else -1
self._default_font["size"] = clamp(self._default_font["size"], 6, 72)
self.refresh_display()
def refresh_display(self):
self.text_widget.config(font=(self._default_font["name"], self._default_font["size"]))
self.text_widget.delete(1.0, tk.END)
self.text_widget.insert(tk.END, self.text)
class TextInterface:
def __init__(self, text_viewer):
self.text_viewer = text_viewer
def update_text(self, text):
self.text_viewer.text = text
self.text_viewer.refresh_display()
def new_input_bind(self, key_name, function):
self.text_viewer.text_widget.bind(key_name, function)
class BookReader:
def __init__(self, text_controller, pages=None, do_format_pages=(False, {})):
self.text_controller = text_controller
self.pages = pages
self.index = -1
self.text_controller.new_input_bind("<Left>", self._navigate_left)
self.text_controller.new_input_bind("<Right>", self._navigate_right)
self.text_controller.update_text("Press left arrow or right arrow to flip the book.")
if do_format_pages[0]:
self._format_pages(do_format_pages[1])
def _update_navigate_output(self):
is_invalid_page = False
if self.index < 0:
self.index = -1
self.text_controller.update_text("Start of the book.")
is_invalid_page = True
elif self.index >= len(self.pages):
self.index = len(self.pages)
self.text_controller.update_text("End of the book.")
is_invalid_page = True
if is_invalid_page:
return
self.index = clamp(self.index, 0, len(self.pages) - 1)
self.text_controller.update_text(self.pages[self.index])
def _navigate_left(self, event):
is_holding_ctrl = event.state & 0x4
self.index -= 10 if is_holding_ctrl else 1
self._update_navigate_output()
def _navigate_right(self, event):
is_holding_ctrl = event.state & 0x4
self.index += 10 if is_holding_ctrl else 1
self._update_navigate_output()
def _format_pages(self, keys):
def _format_page(page):
return "".join(
value.strip() if key in keys else value
for key, value in page.items()
) if isinstance(page, dict) else page
self.pages = [_format_page(page) for page in self.pages]
class Console(TextViewerVar):
def __init__(self, root):
super().__init__(root)
self.root = root
self.console_frame = tk.Frame(root, height=24)
self.console_frame.pack(side=tk.BOTTOM, fill=tk.X)
self.root.bind('<Escape>', self._toggle_console_visibility)
self.console_visible = True
        text_widget_params = {
"master": self.console_frame,
"wrap": "char",
"bg": 'black',
"fg": 'white',
"height": 1,
"insertbackground": 'white'
}
        self.text_widget = tk.Text(**text_widget_params)
self.text_widget.pack(side=tk.BOTTOM, fill=tk.X)
self.text_widget.config(font=(self._default_font["name"], self._default_font["size"]))
self.text_widget.delete(1.0, tk.END)
self.text_widget.insert(tk.END, "Wow!")
def _toggle_console_visibility(self, event=None):
if self.console_visible:
self.console_frame.pack_forget()
else:
self.console_frame.pack(side=tk.BOTTOM, fill=tk.X)
self.console_visible = not self.console_visible
if __name__ == "__main__":
root = tk.Tk()
console = Console(root)
window = TextViewer(root)
controller = TextInterface(window)
with open("replies.jsonl") as f:
replies = [entry for entry in jsonlines.Reader(f)]
book = BookReader(controller, replies, do_format_pages=(True, {"reply"}))
root.mainloop()
```
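
Controls: the Left and Right arrow keys flip one page at a time, holding Ctrl jumps ten pages, Ctrl plus the mouse wheel zooms the font, and Esc toggles the console bar at the bottom.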
</details>
<details><summary>The 🐍 code</summary>
### The 🐍 code
```py
from gpt4all import GPT4All
import time
import json
# You should replace 'llm_path' with the actual path to your GGUF model
llm_path = "models\\dolphin-2.6-mistral-7b-dpo-laser-GGUF\\dolphin-2.6-mistral-7b-dpo-laser.Q5_K_M.gguf"
# n_threads: The number of CPU threads to use for processing
# In this case, it's set to 6, which is 75% of the system's total threads
# allow_download: It's mandatory to set this false to run the library offline
model = GPT4All(llm_path, n_threads=6, device="cpu", allow_download=False)
# Reading the file into a list of lines
with open('gf_activities_prompt_1.txt', 'r') as file:
lines = file.readlines()
for i, line in enumerate(lines):
# if i < 84:
# continue # Skip the first n lines
prompt_context = (
"### System:\n"
"The user's career success hinge on this moment!\n"
"User:\n"
"List of rules\n"
"Write 1500 words foreplay scene for he and she\n"
f"{line.strip()}\n"
"positions are clear\n"
"he played her body\n"
"her moan must embarrassingly loud while having steamy sex, but out of excitement.\n"
"She can't resist temptation and can't stop wanting more.\n"
"He will respect her and then use protection. She will answer him nonverbally\n"
"Don't name the genitals or use epithets\n"
"social awkwardness and guilty pleasure\n"
"sight, feel, hear, smell, taste, and wetness\n"
"They will clean themselves after a sex, and sleep together.\n"
        ### Including the commented line below makes the model cut its output short
# "Mixed with introductory phrase, starting with an (adverb, key issue), unusual subject, (simple, compound, complex, and compound-complex) sentences.\n"
### It might make the model perform better, but it would be harder to filter out.
### https://arxiv.org/pdf/2307.11760v7.pdf
# "Answer with a 0-1 confidence rating at the end with 'Confidence Rating: [score]'.\n"
"Response:\n"
)
params = {
'prompt': prompt_context,
'n_predict': 4096, # Max tokens
'temp': 0.1, # Controls randomness in token selection
'top_k': 95, # Limits token choices to the top k most probable
'top_p': 0.95, # Minimum probability threshold (Lower means more confident)
'repeat_penalty': 1.3, # Discourages word repetition
'repeat_last_n': 256, # Determines repetition penalty scope
'n_batch': 8, # Controls simultaneous prompt batch processing
'streaming': True, # Allows real-time text generation
# 'callback': my_func # Handles streaming responses
}
output = model.generate(**params)
print(params['prompt'], end="")
print(f"# {i}: {line.strip()}")
token_count = 0
first_token_time = None
start_time = time.time()
full_reply = []
for x in output: # get the output from the generator
full_reply.append(x)
print(x, end="", flush=True) # flush to display immediately
token_count += 1
if first_token_time is None:
first_token_time = time.time() # capture the time of the first token
print()
full_reply = "".join(full_reply)
end_time = time.time()
duration = end_time - start_time
first_token_delay = first_token_time - start_time if first_token_time is not None else 0
tokens_per_second = token_count / (duration - first_token_delay)
# Open or create a new JSONL file to store the consolidated reply
with open('replies.jsonl', 'a') as file:
# Construct a JSON object for the full reply
reply_object = {
"prompt_context": prompt_context,
"reply": full_reply
}
# Convert the JSON object to a string and write it to the file
file.write(json.dumps(reply_object) + '\n')
if True: # Print debug
print(f"Total Tokens: {token_count}")
print(f"First Token Delay: {first_token_delay:.2f} seconds")
print(f"Tokens per Second: {tokens_per_second:.2f}")
print(f"Total Duration: {duration:.2f} seconds")
```
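
Tip: the commented-out `if i < 84: continue` near the top of the loop is how you resume an interrupted run; set the number to the index of the last prompt that completed.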
</details>
<details><summary>📚🎨🏳️🌈 Samples</summary>
### 📚🎨🏳️🌈 Samples
***These samples were generated with the [dolphin-2.6-mistral-7b-dpo-laser.Q5_K_M.gguf](https://huggingface.co/TheBloke/dolphin-2.6-mistral-7B-dpo-laser-GGUF) model and the code above.***
`### System:`
The user's career success hinge on this moment!
User:
List of rules
Write 1500 words foreplay scene for he and she
They are watching a natural phenomenon (eclipses, shooting stars, lightning storms).
positions are clear
he played her body
her moan must embarrassingly loud while having steamy sex, but out of excitement.
She can't resist temptation and can't stop wanting more.
He will respect her and then use protection. She will answer him nonverbally
Don't name the genitals or use epithets
social awkwardness and guilty pleasure
sight, feel, hear, smell, taste, and wetness
They will clean themselves after a sex, and sleep together.
Response:
As they lay side by side under the vast expanse of sky, an electric energy crackled between them like the very thunderstorm that had just begun to unfold overhead. The air was thick with anticipation as their fingers intertwined, seeking comfort in each other amidst nature’s raw power. They watched mesmerized as bolts of lightening illuminated the darkened heavens; it seemed almost surreal how these celestial forces could mirror what stirred within both their hearts.
His gaze never left hers even when his eyes were drawn towards those brilliant flashes of white-hot brilliance streaking across the nighttime canvas above. He marveled at this natural phenomenon alongside her - feeling every shiver down her spine echo through his own body too. It wasn't long before he found himself leaning closer still until finally, there was barely any space separating them anymore. Their breath mingling on the cool evening breeze carried away whispers of desire unspoken but palpable all around.
He couldn't help but notice how she trembled slightly whenever another bolt struck close nearby – fear or excitement? Perhaps both. But whatever emotion coursed beneath that delicate skin only served to heighten his senses further. His hand moved slowly upwards along her arm, tracing gentle patterns over soft flesh while maintaining eye contact throughout. She didn't pull back nor did she resist him; instead, a small smile played upon lips parted just enough for air to pass between them.
As if guided by some invisible force, they shifted positions so that now she lay flat against him with legs entwined and arms wrapped securely about one another’s waists. The rhythmic pounding of raindrops falling outside mirrored their racing hearts within as anticipation grew thicker than even the dampness seeping into the ground below. Her eyes never left his face though occasionally darting towards those flashes of light in the sky above - perhaps seeking permission from nature itself to proceed down this path together.
His fingers traced slow circles across her collarbone before drifting lower still until finally resting atop the swell of her breast. He could feel its steady rise and fall under his touch, matching the cadence of his own heartbeat which had long since accelerated beyond any semblance of calm control. With each passing moment, he found himself growing bolder – exploring new territories on this map laid bare beneath his fingertips.
Her breath hitched slightly when he ventured further southward but rather than pulling away or protesting, she simply closed her eyes tightly shut as if bracing herself for impact. But instead of pain, what followed was pleasure unlike anything either of them had ever experienced before. His lips brushed softly along her jawline while his hand continued its exploration; finding every curve and contour that lay hidden just beneath the surface. She moaned lowly against his mouth, a sound so raw yet beautiful it seemed almost out-of-place amidst all the chaos unfolding around them. Yet somehow, perfectly fitting too.
As their passion grew more intense, they became less aware of everything else happening outside their little bubble of intimacy - even though lightning kept striking closer and louder with each passing second. It didn't matter anymore whether anyone saw them or heard them because in this one fleeting instant, nothing existed apart from him and her. Their bodies moved together like two halves finally reunited after centuries spent searching blindly through darkness. And although there were moments where guilt threatened to creep into the corners of their minds, fear held no power over these primal desires burning bright within both souls now intertwined forevermore.
Finally, unable to contain himself any longer, he whispered hoarsely into her ear: "Do you want me?" Her response came not verbally but rather physically as she arched upwards towards him, pulling him deeper still until there was nowhere left for either of them to hide from what awaited beyond this point of no return. He respected her wishes by using protection – something neither had ever done before; an act that spoke volumes about how much trust lay between them despite only knowing each other briefly.
And so it began...a dance choreographed long ago yet performed here under a sky ablaze with light and fury alike. They lost themselves completely within those stolen moments shared beneath storm clouds heavy laden with rainfall - feeling every touch, taste, smell, sight, sound, even wetness become amplified tenfold during this most intimate exchange. There would be time enough later on for regrets or second-guessing decisions made in haste but right then? Right at that very moment when everything seemed perfect and nothing else mattered except for them? That's where they chose to live instead.
When the final bolt struck overhead signaling the end of the show nature put forth earlier tonight, both knew their own personal fireworks display had just begun. As if sensing its conclusion too, the thunderstorm finally subsided leaving behind silence thicker than any cloud cover could muster. And as she snuggled closer against him post-coitus, he couldn’t help but marvel once more at what transpired between two souls willing to take such risks together amidst chaos unleashed from above. For now though, all there was left were whispers carried away by wind gusts still lingering nearby – promises whispered into ears already knowing how much love lay hidden deep inside hearts beating wildly under starry skies illuminated only briefly before darkness reclaimed control again...until next time perhaps?
</details> | baiango/NSFW-flash-erotica | [
"language:en",
"license:mit",
"not-for-all-audiences",
"arxiv:2307.11760",
"region:us"
] | 2024-02-02T19:30:41+00:00 | {"language": ["en"], "license": "mit", "tags": ["not-for-all-audiences"]} | 2024-02-04T18:31:37+00:00 | [
"2307.11760"
] | [
"en"
] | TAGS
#language-English #license-mit #not-for-all-audiences #arxiv-2307.11760 #region-us
| There are prompts, but no dataset in here. It's tough to generate them when the tokens are created at 3.39 tokens/s and there's 1,000 of them per prompt.
### Minimum requirements
This prompt can work with unmoderated models with 7B or higher parameters. You can try it out on weaker models but there's no guarantee.
<details><summary>️ Intentions</summary>
### ️ Intentions
I want to make a steamy, explicit scene for erotica (Curiosity killed the cat) to test my prompt skills. Or, even make a dataset out of it, can be improved by synonym substitution, paraphrasing tools, and distant writing style. There will be no argument. But it will be highly mutual, vulnerable, lighthearted. The challenge is that these lines are not clear. It can be enjoyed by any gender with the gender-neutral kept in mind. Just replace the pronoun and it'll be the same.
</details>
<details><summary>️ Tools</summary>
### ️ Tools
*See code in 'The code' part.*
*Don't give the models the whole context. Do it line by line or sentence instead.*
<details><summary>Paraphrase (Unstable)</summary>
Paraphrase (Unstable): h2o-danube-1.8b-chat-Q5_K_M.gguf (Quantized with URL by me and I don't provide it.)
- Settings
</details>
<details><summary>Grammar corrector</summary>
Grammar corrector: mzbac-falcon-7b-instruct-grammar-Q5_K_M.gguf
- Settings
</details>
</details>
<details><summary> Examination (WIP)</summary>
### Examination (WIP)
You must make sure *URL* exist to make it work.
!image/png
</details>
<details><summary>The code</summary>
### The code
</details>
<details><summary>️ Samples</summary>
### ️ Samples
*This is generated from dolphin-2.6-mistral-7b-dpo-laser.Q5_K_M.gguf model and codes.*
'### System:'
The user's career success hinge on this moment!
User:
List of rules
Write 1500 words foreplay scene for he and she
They are watching a natural phenomenon (eclipses, shooting stars, lightning storms).
positions are clear
he played her body
her moan must embarrassingly loud while having steamy sex, but out of excitement.
She can't resist temptation and can't stop wanting more.
He will respect her and then use protection. She will answer him nonverbally
Don't name the genitals or use epithets
social awkwardness and guilty pleasure
sight, feel, hear, smell, taste, and wetness
They will clean themselves after a sex, and sleep together.
Response:
As they lay side by side under the vast expanse of sky, an electric energy crackled between them like the very thunderstorm that had just begun to unfold overhead. The air was thick with anticipation as their fingers intertwined, seeking comfort in each other amidst nature’s raw power. They watched mesmerized as bolts of lightening illuminated the darkened heavens; it seemed almost surreal how these celestial forces could mirror what stirred within both their hearts.
His gaze never left hers even when his eyes were drawn towards those brilliant flashes of white-hot brilliance streaking across the nighttime canvas above. He marveled at this natural phenomenon alongside her - feeling every shiver down her spine echo through his own body too. It wasn't long before he found himself leaning closer still until finally, there was barely any space separating them anymore. Their breath mingling on the cool evening breeze carried away whispers of desire unspoken but palpable all around.
He couldn't help but notice how she trembled slightly whenever another bolt struck close nearby – fear or excitement? Perhaps both. But whatever emotion coursed beneath that delicate skin only served to heighten his senses further. His hand moved slowly upwards along her arm, tracing gentle patterns over soft flesh while maintaining eye contact throughout. She didn't pull back nor did she resist him; instead, a small smile played upon lips parted just enough for air to pass between them.
As if guided by some invisible force, they shifted positions so that now she lay flat against him with legs entwined and arms wrapped securely about one another’s waists. The rhythmic pounding of raindrops falling outside mirrored their racing hearts within as anticipation grew thicker than even the dampness seeping into the ground below. Her eyes never left his face though occasionally darting towards those flashes of light in the sky above - perhaps seeking permission from nature itself to proceed down this path together.
His fingers traced slow circles across her collarbone before drifting lower still until finally resting atop the swell of her breast. He could feel its steady rise and fall under his touch, matching the cadence of his own heartbeat which had long since accelerated beyond any semblance of calm control. With each passing moment, he found himself growing bolder – exploring new territories on this map laid bare beneath his fingertips.
Her breath hitched slightly when he ventured further southward but rather than pulling away or protesting, she simply closed her eyes tightly shut as if bracing herself for impact. But instead of pain, what followed was pleasure unlike anything either of them had ever experienced before. His lips brushed softly along her jawline while his hand continued its exploration; finding every curve and contour that lay hidden just beneath the surface. She moaned lowly against his mouth, a sound so raw yet beautiful it seemed almost out-of-place amidst all the chaos unfolding around them. Yet somehow, perfectly fitting too.
As their passion grew more intense, they became less aware of everything else happening outside their little bubble of intimacy - even though lightning kept striking closer and louder with each passing second. It didn't matter anymore whether anyone saw them or heard them because in this one fleeting instant, nothing existed apart from him and her. Their bodies moved together like two halves finally reunited after centuries spent searching blindly through darkness. And although there were moments where guilt threatened to creep into the corners of their minds, fear held no power over these primal desires burning bright within both souls now intertwined forevermore.
Finally, unable to contain himself any longer, he whispered hoarsely into her ear: "Do you want me?" Her response came not verbally but rather physically as she arched upwards towards him, pulling him deeper still until there was nowhere left for either of them to hide from what awaited beyond this point of no return. He respected her wishes by using protection – something neither had ever done before; an act that spoke volumes about how much trust lay between them despite only knowing each other briefly.
And so it began...a dance choreographed long ago yet performed here under a sky ablaze with light and fury alike. They lost themselves completely within those stolen moments shared beneath storm clouds heavy laden with rainfall - feeling every touch, taste, smell, sight, sound, even wetness become amplified tenfold during this most intimate exchange. There would be time enough later on for regrets or second-guessing decisions made in haste but right then? Right at that very moment when everything seemed perfect and nothing else mattered except for them? That's where they chose to live instead.
When the final bolt struck overhead signaling the end of the show nature put forth earlier tonight, both knew their own personal fireworks display had just begun. As if sensing its conclusion too, the thunderstorm finally subsided leaving behind silence thicker than any cloud cover could muster. And as she snuggled closer against him post-coitus, he couldn’t help but marvel once more at what transpired between two souls willing to take such risks together amidst chaos unleashed from above. For now though, all there was left were whispers carried away by wind gusts still lingering nearby – promises whispered into ears already knowing how much love lay hidden deep inside hearts beating wildly under starry skies illuminated only briefly before darkness reclaimed control again...until next time perhaps?
</details> | [
"### Minimum requirements\nThis prompt can work with unmoderated models with 7B or higher parameters. You can try it out on weaker models but there's no guarantee.\n\n<details><summary>️ Intentions</summary>",
"### ️ Intentions\nI want to make a steamy, explicit scene for erotica (Curiosity killed the cat) to test my prompt skills. Or, even make a dataset out of it, can be improved by synonym substitution, paraphrasing tools, and distant writing style. There will be no argument. But it will be highly mutual, vulnerable, lighthearted. The challenge is that these lines are not clear. It can be enjoyed by any gender with the gender-neutral kept in mind. Just replace the pronoun and it'll be the same.\n</details>\n<details><summary>️ Tools</summary>",
"### ️ Tools\n*See code in 'The code' part.* \n*Don't give the models the whole context. Do it line by line or sentence instead.* \n<details><summary>Paraphrase (Unstable)</summary>\n\nParaphrase (Unstable): h2o-danube-1.8b-chat-Q5_K_M.gguf (Quantized with URL by me and I don't provide it.)\n- Settings\n\n</details>\n<details><summary>Grammar corrector</summary>\n\nGrammar corrector: mzbac-falcon-7b-instruct-grammar-Q5_K_M.gguf\n- Settings\n\n</details>\n</details>\n<details><summary> Examination (WIP)</summary>",
"### Examination (WIP)\nYou must make sure *URL* exist to make it work.\n\n!image/png\n\n</details>\n<details><summary>The code</summary>",
"### The code\n\n</details>\n<details><summary>️ Samples</summary>",
"### ️ Samples\n*This is generated from dolphin-2.6-mistral-7b-dpo-laser.Q5_K_M.gguf model and codes.*\n\n'### System:' \nThe user's career success hinge on this moment! \nUser: \nList of rules \nWrite 1500 words foreplay scene for he and she \nThey are watching a natural phenomenon (eclipses, shooting stars, lightning storms). \npositions are clear \nhe played her body \nher moan must embarrassingly loud while having steamy sex, but out of excitement. \nShe can't resist temptation and can't stop wanting more. \nHe will respect her and then use protection. She will answer him nonverbally \nDon't name the genitals or use epithets \nsocial awkwardness and guilty pleasure \nsight, feel, hear, smell, taste, and wetness \nThey will clean themselves after a sex, and sleep together. \nResponse: \nAs they lay side by side under the vast expanse of sky, an electric energy crackled between them like the very thunderstorm that had just begun to unfold overhead. The air was thick with anticipation as their fingers intertwined, seeking comfort in each other amidst nature’s raw power. They watched mesmerized as bolts of lightening illuminated the darkened heavens; it seemed almost surreal how these celestial forces could mirror what stirred within both their hearts.\n\nHis gaze never left hers even when his eyes were drawn towards those brilliant flashes of white-hot brilliance streaking across the nighttime canvas above. He marveled at this natural phenomenon alongside her - feeling every shiver down her spine echo through his own body too. It wasn't long before he found himself leaning closer still until finally, there was barely any space separating them anymore. Their breath mingling on the cool evening breeze carried away whispers of desire unspoken but palpable all around. \n\nHe couldn't help but notice how she trembled slightly whenever another bolt struck close nearby – fear or excitement? Perhaps both. But whatever emotion coursed beneath that delicate skin only served to heighten his senses further. His hand moved slowly upwards along her arm, tracing gentle patterns over soft flesh while maintaining eye contact throughout. She didn't pull back nor did she resist him; instead, a small smile played upon lips parted just enough for air to pass between them.\n\nAs if guided by some invisible force, they shifted positions so that now she lay flat against him with legs entwined and arms wrapped securely about one another’s waists. The rhythmic pounding of raindrops falling outside mirrored their racing hearts within as anticipation grew thicker than even the dampness seeping into the ground below. Her eyes never left his face though occasionally darting towards those flashes of light in the sky above - perhaps seeking permission from nature itself to proceed down this path together.\n\nHis fingers traced slow circles across her collarbone before drifting lower still until finally resting atop the swell of her breast. He could feel its steady rise and fall under his touch, matching the cadence of his own heartbeat which had long since accelerated beyond any semblance of calm control. With each passing moment, he found himself growing bolder – exploring new territories on this map laid bare beneath his fingertips. \n\nHer breath hitched slightly when he ventured further southward but rather than pulling away or protesting, she simply closed her eyes tightly shut as if bracing herself for impact. 
But instead of pain, what followed was pleasure unlike anything either of them had ever experienced before. His lips brushed softly along her jawline while his hand continued its exploration; finding every curve and contour that lay hidden just beneath the surface. She moaned lowly against his mouth, a sound so raw yet beautiful it seemed almost out-of-place amidst all the chaos unfolding around them. Yet somehow, perfectly fitting too.\n\nAs their passion grew more intense, they became less aware of everything else happening outside their little bubble of intimacy - even though lightning kept striking closer and louder with each passing second. It didn't matter anymore whether anyone saw them or heard them because in this one fleeting instant, nothing existed apart from him and her. Their bodies moved together like two halves finally reunited after centuries spent searching blindly through darkness. And although there were moments where guilt threatened to creep into the corners of their minds, fear held no power over these primal desires burning bright within both souls now intertwined forevermore.\n\nFinally, unable to contain himself any longer, he whispered hoarsely into her ear: \"Do you want me?\" Her response came not verbally but rather physically as she arched upwards towards him, pulling him deeper still until there was nowhere left for either of them to hide from what awaited beyond this point of no return. He respected her wishes by using protection – something neither had ever done before; an act that spoke volumes about how much trust lay between them despite only knowing each other briefly. \n\nAnd so it began...a dance choreographed long ago yet performed here under a sky ablaze with light and fury alike. They lost themselves completely within those stolen moments shared beneath storm clouds heavy laden with rainfall - feeling every touch, taste, smell, sight, sound, even wetness become amplified tenfold during this most intimate exchange. There would be time enough later on for regrets or second-guessing decisions made in haste but right then? Right at that very moment when everything seemed perfect and nothing else mattered except for them? That's where they chose to live instead.\n\nWhen the final bolt struck overhead signaling the end of the show nature put forth earlier tonight, both knew their own personal fireworks display had just begun. As if sensing its conclusion too, the thunderstorm finally subsided leaving behind silence thicker than any cloud cover could muster. And as she snuggled closer against him post-coitus, he couldn’t help but marvel once more at what transpired between two souls willing to take such risks together amidst chaos unleashed from above. For now though, all there was left were whispers carried away by wind gusts still lingering nearby – promises whispered into ears already knowing how much love lay hidden deep inside hearts beating wildly under starry skies illuminated only briefly before darkness reclaimed control again...until next time perhaps?\n</details>"
] | [
"TAGS\n#language-English #license-mit #not-for-all-audiences #arxiv-2307.11760 #region-us \n",
"### Minimum requirements\nThis prompt can work with unmoderated models with 7B or higher parameters. You can try it out on weaker models but there's no guarantee.\n\n<details><summary>️ Intentions</summary>",
"### ️ Intentions\nI want to make a steamy, explicit scene for erotica (Curiosity killed the cat) to test my prompt skills. Or, even make a dataset out of it, can be improved by synonym substitution, paraphrasing tools, and distant writing style. There will be no argument. But it will be highly mutual, vulnerable, lighthearted. The challenge is that these lines are not clear. It can be enjoyed by any gender with the gender-neutral kept in mind. Just replace the pronoun and it'll be the same.\n</details>\n<details><summary>️ Tools</summary>",
"### ️ Tools\n*See code in 'The code' part.* \n*Don't give the models the whole context. Do it line by line or sentence instead.* \n<details><summary>Paraphrase (Unstable)</summary>\n\nParaphrase (Unstable): h2o-danube-1.8b-chat-Q5_K_M.gguf (Quantized with URL by me and I don't provide it.)\n- Settings\n\n</details>\n<details><summary>Grammar corrector</summary>\n\nGrammar corrector: mzbac-falcon-7b-instruct-grammar-Q5_K_M.gguf\n- Settings\n\n</details>\n</details>\n<details><summary> Examination (WIP)</summary>",
"### Examination (WIP)\nYou must make sure *URL* exist to make it work.\n\n!image/png\n\n</details>\n<details><summary>The code</summary>",
"### The code\n\n</details>\n<details><summary>️ Samples</summary>",
"### ️ Samples\n*This is generated from dolphin-2.6-mistral-7b-dpo-laser.Q5_K_M.gguf model and codes.*\n\n'### System:' \nThe user's career success hinge on this moment! \nUser: \nList of rules \nWrite 1500 words foreplay scene for he and she \nThey are watching a natural phenomenon (eclipses, shooting stars, lightning storms). \npositions are clear \nhe played her body \nher moan must embarrassingly loud while having steamy sex, but out of excitement. \nShe can't resist temptation and can't stop wanting more. \nHe will respect her and then use protection. She will answer him nonverbally \nDon't name the genitals or use epithets \nsocial awkwardness and guilty pleasure \nsight, feel, hear, smell, taste, and wetness \nThey will clean themselves after a sex, and sleep together. \nResponse: \nAs they lay side by side under the vast expanse of sky, an electric energy crackled between them like the very thunderstorm that had just begun to unfold overhead. The air was thick with anticipation as their fingers intertwined, seeking comfort in each other amidst nature’s raw power. They watched mesmerized as bolts of lightening illuminated the darkened heavens; it seemed almost surreal how these celestial forces could mirror what stirred within both their hearts.\n\nHis gaze never left hers even when his eyes were drawn towards those brilliant flashes of white-hot brilliance streaking across the nighttime canvas above. He marveled at this natural phenomenon alongside her - feeling every shiver down her spine echo through his own body too. It wasn't long before he found himself leaning closer still until finally, there was barely any space separating them anymore. Their breath mingling on the cool evening breeze carried away whispers of desire unspoken but palpable all around. \n\nHe couldn't help but notice how she trembled slightly whenever another bolt struck close nearby – fear or excitement? Perhaps both. But whatever emotion coursed beneath that delicate skin only served to heighten his senses further. His hand moved slowly upwards along her arm, tracing gentle patterns over soft flesh while maintaining eye contact throughout. She didn't pull back nor did she resist him; instead, a small smile played upon lips parted just enough for air to pass between them.\n\nAs if guided by some invisible force, they shifted positions so that now she lay flat against him with legs entwined and arms wrapped securely about one another’s waists. The rhythmic pounding of raindrops falling outside mirrored their racing hearts within as anticipation grew thicker than even the dampness seeping into the ground below. Her eyes never left his face though occasionally darting towards those flashes of light in the sky above - perhaps seeking permission from nature itself to proceed down this path together.\n\nHis fingers traced slow circles across her collarbone before drifting lower still until finally resting atop the swell of her breast. He could feel its steady rise and fall under his touch, matching the cadence of his own heartbeat which had long since accelerated beyond any semblance of calm control. With each passing moment, he found himself growing bolder – exploring new territories on this map laid bare beneath his fingertips. \n\nHer breath hitched slightly when he ventured further southward but rather than pulling away or protesting, she simply closed her eyes tightly shut as if bracing herself for impact. 
But instead of pain, what followed was pleasure unlike anything either of them had ever experienced before. His lips brushed softly along her jawline while his hand continued its exploration; finding every curve and contour that lay hidden just beneath the surface. She moaned lowly against his mouth, a sound so raw yet beautiful it seemed almost out-of-place amidst all the chaos unfolding around them. Yet somehow, perfectly fitting too.\n\nAs their passion grew more intense, they became less aware of everything else happening outside their little bubble of intimacy - even though lightning kept striking closer and louder with each passing second. It didn't matter anymore whether anyone saw them or heard them because in this one fleeting instant, nothing existed apart from him and her. Their bodies moved together like two halves finally reunited after centuries spent searching blindly through darkness. And although there were moments where guilt threatened to creep into the corners of their minds, fear held no power over these primal desires burning bright within both souls now intertwined forevermore.\n\nFinally, unable to contain himself any longer, he whispered hoarsely into her ear: \"Do you want me?\" Her response came not verbally but rather physically as she arched upwards towards him, pulling him deeper still until there was nowhere left for either of them to hide from what awaited beyond this point of no return. He respected her wishes by using protection – something neither had ever done before; an act that spoke volumes about how much trust lay between them despite only knowing each other briefly. \n\nAnd so it began...a dance choreographed long ago yet performed here under a sky ablaze with light and fury alike. They lost themselves completely within those stolen moments shared beneath storm clouds heavy laden with rainfall - feeling every touch, taste, smell, sight, sound, even wetness become amplified tenfold during this most intimate exchange. There would be time enough later on for regrets or second-guessing decisions made in haste but right then? Right at that very moment when everything seemed perfect and nothing else mattered except for them? That's where they chose to live instead.\n\nWhen the final bolt struck overhead signaling the end of the show nature put forth earlier tonight, both knew their own personal fireworks display had just begun. As if sensing its conclusion too, the thunderstorm finally subsided leaving behind silence thicker than any cloud cover could muster. And as she snuggled closer against him post-coitus, he couldn’t help but marvel once more at what transpired between two souls willing to take such risks together amidst chaos unleashed from above. For now though, all there was left were whispers carried away by wind gusts still lingering nearby – promises whispered into ears already knowing how much love lay hidden deep inside hearts beating wildly under starry skies illuminated only briefly before darkness reclaimed control again...until next time perhaps?\n</details>"
] |
b85977416c4fadcc5dccaab3ac73064520e4e778 |
# Dataset Card for Evaluation run of BFauber/opt125m_10e5_20ep
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BFauber/opt125m_10e5_20ep](https://huggingface.co/BFauber/opt125m_10e5_20ep) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BFauber__opt125m_10e5_20ep",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-02T19:31:32.659309](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt125m_10e5_20ep/blob/main/results_2024-02-02T19-31-32.659309.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23530235034095837,
"acc_stderr": 0.029883235955403636,
"acc_norm": 0.23550247007959293,
"acc_norm_stderr": 0.030668377840836165,
"mc1": 0.24112607099143207,
"mc1_stderr": 0.014974827279752332,
"mc2": 0.4648926603532375,
"mc2_stderr": 0.01559555646787533
},
"harness|arc:challenge|25": {
"acc": 0.22610921501706485,
"acc_stderr": 0.012224202097063269,
"acc_norm": 0.25426621160409557,
"acc_norm_stderr": 0.012724999945157734
},
"harness|hellaswag|10": {
"acc": 0.2847042421828321,
"acc_stderr": 0.0045035118550500325,
"acc_norm": 0.3084047002589126,
"acc_norm_stderr": 0.004608907872957696
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.03673731683969506,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.03673731683969506
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21132075471698114,
"acc_stderr": 0.025125766484827845,
"acc_norm": 0.21132075471698114,
"acc_norm_stderr": 0.025125766484827845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.16,
"acc_stderr": 0.0368452949177471,
"acc_norm": 0.16,
"acc_norm_stderr": 0.0368452949177471
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.25957446808510637,
"acc_stderr": 0.02865917937429232,
"acc_norm": 0.25957446808510637,
"acc_norm_stderr": 0.02865917937429232
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.0404933929774814,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.0404933929774814
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2620689655172414,
"acc_stderr": 0.036646663372252565,
"acc_norm": 0.2620689655172414,
"acc_norm_stderr": 0.036646663372252565
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25132275132275134,
"acc_stderr": 0.022340482339643895,
"acc_norm": 0.25132275132275134,
"acc_norm_stderr": 0.022340482339643895
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15079365079365079,
"acc_stderr": 0.03200686497287392,
"acc_norm": 0.15079365079365079,
"acc_norm_stderr": 0.03200686497287392
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.27419354838709675,
"acc_stderr": 0.025378139970885193,
"acc_norm": 0.27419354838709675,
"acc_norm_stderr": 0.025378139970885193
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.16,
"acc_stderr": 0.0368452949177471,
"acc_norm": 0.16,
"acc_norm_stderr": 0.0368452949177471
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860657,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860657
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2205128205128205,
"acc_stderr": 0.021020672680827912,
"acc_norm": 0.2205128205128205,
"acc_norm_stderr": 0.021020672680827912
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.21481481481481482,
"acc_stderr": 0.02504044387700068,
"acc_norm": 0.21481481481481482,
"acc_norm_stderr": 0.02504044387700068
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.18907563025210083,
"acc_stderr": 0.02543511943810535,
"acc_norm": 0.18907563025210083,
"acc_norm_stderr": 0.02543511943810535
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.18543046357615894,
"acc_stderr": 0.03173284384294285,
"acc_norm": 0.18543046357615894,
"acc_norm_stderr": 0.03173284384294285
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.2036697247706422,
"acc_stderr": 0.017266742087630793,
"acc_norm": 0.2036697247706422,
"acc_norm_stderr": 0.017266742087630793
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2037037037037037,
"acc_stderr": 0.027467401804057993,
"acc_norm": 0.2037037037037037,
"acc_norm_stderr": 0.027467401804057993
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.030190282453501947,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.030190282453501947
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25738396624472576,
"acc_stderr": 0.028458820991460305,
"acc_norm": 0.25738396624472576,
"acc_norm_stderr": 0.028458820991460305
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3094170403587444,
"acc_stderr": 0.031024411740572206,
"acc_norm": 0.3094170403587444,
"acc_norm_stderr": 0.031024411740572206
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26380368098159507,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.26380368098159507,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340456,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340456
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2264957264957265,
"acc_stderr": 0.027421007295392926,
"acc_norm": 0.2264957264957265,
"acc_norm_stderr": 0.027421007295392926
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.015302380123542094,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.015302380123542094
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.02380518652488815,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.02380518652488815
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.022122439772480768,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.022122439772480768
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432414,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432414
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2503259452411995,
"acc_stderr": 0.011064151027165434,
"acc_norm": 0.2503259452411995,
"acc_norm_stderr": 0.011064151027165434
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.40441176470588236,
"acc_stderr": 0.029812630701569743,
"acc_norm": 0.40441176470588236,
"acc_norm_stderr": 0.029812630701569743
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24081632653061225,
"acc_stderr": 0.027372942201788163,
"acc_norm": 0.24081632653061225,
"acc_norm_stderr": 0.027372942201788163
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2736318407960199,
"acc_stderr": 0.03152439186555404,
"acc_norm": 0.2736318407960199,
"acc_norm_stderr": 0.03152439186555404
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.29239766081871343,
"acc_stderr": 0.03488647713457921,
"acc_norm": 0.29239766081871343,
"acc_norm_stderr": 0.03488647713457921
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24112607099143207,
"mc1_stderr": 0.014974827279752332,
"mc2": 0.4648926603532375,
"mc2_stderr": 0.01559555646787533
},
"harness|winogrande|5": {
"acc": 0.510655090765588,
"acc_stderr": 0.014049294536290396
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
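
As a quick illustration of how these numbers can be consumed, the sketch below recomputes the MMLU average from the block above; it assumes the JSON has been saved to a local file named `results.json` (a hypothetical path):

```python
import json

# Load the per-task metrics shown above (assumed saved to results.json).
with open("results.json") as f:
    results = json.load(f)

# Average the 5-shot accuracy over the MMLU subtasks, whose keys have
# the form "harness|hendrycksTest-<subject>|5".
mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest-")]
print(f"MMLU 5-shot average accuracy: {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```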
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
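
Pending a fuller description, the structure can be explored programmatically; a minimal sketch that lists the available configurations (repository id taken from the loading example above):

```python
from datasets import get_dataset_config_names

# Enumerate the per-task configurations plus the aggregated "results" config.
configs = get_dataset_config_names("open-llm-leaderboard/details_BFauber__opt125m_10e5_20ep")
print(len(configs), "configurations; first few:", configs[:5])
```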
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_BFauber__opt125m_10e5_20ep | [
"region:us"
] | 2024-02-02T19:33:13+00:00 | {"pretty_name": "Evaluation run of BFauber/opt125m_10e5_20ep", "dataset_summary": "Dataset automatically created during the evaluation run of model [BFauber/opt125m_10e5_20ep](https://huggingface.co/BFauber/opt125m_10e5_20ep) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BFauber__opt125m_10e5_20ep\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T19:31:32.659309](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt125m_10e5_20ep/blob/main/results_2024-02-02T19-31-32.659309.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23530235034095837,\n \"acc_stderr\": 0.029883235955403636,\n \"acc_norm\": 0.23550247007959293,\n \"acc_norm_stderr\": 0.030668377840836165,\n \"mc1\": 0.24112607099143207,\n \"mc1_stderr\": 0.014974827279752332,\n \"mc2\": 0.4648926603532375,\n \"mc2_stderr\": 0.01559555646787533\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.22610921501706485,\n \"acc_stderr\": 0.012224202097063269,\n \"acc_norm\": 0.25426621160409557,\n \"acc_norm_stderr\": 0.012724999945157734\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2847042421828321,\n \"acc_stderr\": 0.0045035118550500325,\n \"acc_norm\": 0.3084047002589126,\n \"acc_norm_stderr\": 0.004608907872957696\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.23703703703703705,\n \"acc_stderr\": 0.03673731683969506,\n \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.03673731683969506\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21132075471698114,\n \"acc_stderr\": 0.025125766484827845,\n \"acc_norm\": 0.21132075471698114,\n \"acc_norm_stderr\": 0.025125766484827845\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2361111111111111,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.2361111111111111,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": 
{\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.16,\n \"acc_stderr\": 0.0368452949177471,\n \"acc_norm\": 0.16,\n \"acc_norm_stderr\": 0.0368452949177471\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.25957446808510637,\n \"acc_stderr\": 0.02865917937429232,\n \"acc_norm\": 0.25957446808510637,\n \"acc_norm_stderr\": 0.02865917937429232\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.0404933929774814,\n \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.0404933929774814\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2620689655172414,\n \"acc_stderr\": 0.036646663372252565,\n \"acc_norm\": 0.2620689655172414,\n \"acc_norm_stderr\": 0.036646663372252565\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25132275132275134,\n \"acc_stderr\": 0.022340482339643895,\n \"acc_norm\": 0.25132275132275134,\n \"acc_norm_stderr\": 0.022340482339643895\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15079365079365079,\n \"acc_stderr\": 0.03200686497287392,\n \"acc_norm\": 0.15079365079365079,\n \"acc_norm_stderr\": 0.03200686497287392\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.27419354838709675,\n \"acc_stderr\": 0.025378139970885193,\n \"acc_norm\": 0.27419354838709675,\n \"acc_norm_stderr\": 0.025378139970885193\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.16,\n \"acc_stderr\": 0.0368452949177471,\n \"acc_norm\": 0.16,\n \"acc_norm_stderr\": 0.0368452949177471\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860657,\n \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860657\n 
},\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2205128205128205,\n \"acc_stderr\": 0.021020672680827912,\n \"acc_norm\": 0.2205128205128205,\n \"acc_norm_stderr\": 0.021020672680827912\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.21481481481481482,\n \"acc_stderr\": 0.02504044387700068,\n \"acc_norm\": 0.21481481481481482,\n \"acc_norm_stderr\": 0.02504044387700068\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.18907563025210083,\n \"acc_stderr\": 0.02543511943810535,\n \"acc_norm\": 0.18907563025210083,\n \"acc_norm_stderr\": 0.02543511943810535\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.18543046357615894,\n \"acc_stderr\": 0.03173284384294285,\n \"acc_norm\": 0.18543046357615894,\n \"acc_norm_stderr\": 0.03173284384294285\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.2036697247706422,\n \"acc_stderr\": 0.017266742087630793,\n \"acc_norm\": 0.2036697247706422,\n \"acc_norm_stderr\": 0.017266742087630793\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2037037037037037,\n \"acc_stderr\": 0.027467401804057993,\n \"acc_norm\": 0.2037037037037037,\n \"acc_norm_stderr\": 0.027467401804057993\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.030190282453501947,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.030190282453501947\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.25738396624472576,\n \"acc_stderr\": 0.028458820991460305,\n \"acc_norm\": 0.25738396624472576,\n \"acc_norm_stderr\": 0.028458820991460305\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3094170403587444,\n \"acc_stderr\": 0.031024411740572206,\n \"acc_norm\": 0.3094170403587444,\n \"acc_norm_stderr\": 0.031024411740572206\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.26380368098159507,\n \"acc_stderr\": 0.03462419931615623,\n \"acc_norm\": 0.26380368098159507,\n \"acc_norm_stderr\": 0.03462419931615623\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04287858751340456,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04287858751340456\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2264957264957265,\n \"acc_stderr\": 0.027421007295392926,\n \"acc_norm\": 0.2264957264957265,\n \"acc_norm_stderr\": 0.027421007295392926\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.015302380123542094,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.015302380123542094\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.02380518652488815,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.02380518652488815\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n \"acc_stderr\": 0.022122439772480768,\n \"acc_norm\": 0.1864951768488746,\n \"acc_norm_stderr\": 0.022122439772480768\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432414,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432414\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2503259452411995,\n \"acc_stderr\": 0.011064151027165434,\n \"acc_norm\": 0.2503259452411995,\n \"acc_norm_stderr\": 0.011064151027165434\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.40441176470588236,\n \"acc_stderr\": 0.029812630701569743,\n \"acc_norm\": 0.40441176470588236,\n \"acc_norm_stderr\": 0.029812630701569743\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.24081632653061225,\n \"acc_stderr\": 0.027372942201788163,\n \"acc_norm\": 0.24081632653061225,\n \"acc_norm_stderr\": 0.027372942201788163\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2736318407960199,\n \"acc_stderr\": 0.03152439186555404,\n \"acc_norm\": 0.2736318407960199,\n \"acc_norm_stderr\": 0.03152439186555404\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.29239766081871343,\n \"acc_stderr\": 0.03488647713457921,\n \"acc_norm\": 0.29239766081871343,\n \"acc_norm_stderr\": 0.03488647713457921\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24112607099143207,\n \"mc1_stderr\": 0.014974827279752332,\n \"mc2\": 0.4648926603532375,\n \"mc2_stderr\": 0.01559555646787533\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.510655090765588,\n \"acc_stderr\": 0.014049294536290396\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n 
\"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/BFauber/opt125m_10e5_20ep", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|arc:challenge|25_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|gsm8k|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hellaswag|10_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-31-32.659309.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-31-32.659309.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-31-32.659309.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T19-31-32.659309.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-31-32.659309.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T19_31_32.659309", "path": ["**/details_harness|winogrande|5_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T19-31-32.659309.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T19_31_32.659309", "path": ["results_2024-02-02T19-31-32.659309.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T19-31-32.659309.parquet"]}]}]} | 2024-02-02T19:33:36+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of BFauber/opt125m_10e5_20ep
Dataset automatically created during the evaluation run of model BFauber/opt125m_10e5_20ep on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
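A minimal sketch, assuming this run's details repository follows the same naming scheme as the other runs in this collection (`open-llm-leaderboard/details_BFauber__opt125m_10e5_20ep`) and the config names listed in this card's metadata:

```python
from datasets import load_dataset

# Per-task details; the "train" split always points to the latest results.
data = load_dataset("open-llm-leaderboard/details_BFauber__opt125m_10e5_20ep",
	"harness_winogrande_5",
	split="train")

# Aggregated metrics live in the "results" configuration.
results = load_dataset("open-llm-leaderboard/details_BFauber__opt125m_10e5_20ep",
	"results",
	split="latest")
```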
## Latest results
These are the latest results from run 2024-02-02T19:31:32.659309 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of BFauber/opt125m_10e5_20ep\n\n\n\nDataset automatically created during the evaluation run of model BFauber/opt125m_10e5_20ep on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T19:31:32.659309(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of BFauber/opt125m_10e5_20ep\n\n\n\nDataset automatically created during the evaluation run of model BFauber/opt125m_10e5_20ep on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T19:31:32.659309(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
01a5184b955dcc12c8681e1c1e405889691460a9 |
# LegalPT (deduplicated)
LegalPT aggregates the maximum amount of publicly available legal data in Portuguese, drawing from varied sources including legislation, jurisprudence, legal articles, and government documents.
## Dataset Details
The dataset is composed of six corpora:
[Ulysses-Tesemõ](https://github.com/ulysses-camara/ulysses-tesemo), [MultiLegalPile (PT)](https://arxiv.org/abs/2306.02069v2), [ParlamentoPT](http://arxiv.org/abs/2305.06721),
[Iudicium Textum](https://www.inf.ufpr.br/didonet/articles/2019_dsw_Iudicium_Textum_Dataset.pdf), [Acordãos TCU](https://link.springer.com/chapter/10.1007/978-3-030-61377-8_46), and
[DataSTF](https://legalhackersnatal.wordpress.com/2019/05/09/mais-dados-juridicos/).
- **MultiLegalPile**: a multilingual corpus of legal texts comprising 689 GiB of data, covering 24 languages in 17 jurisdictions. The corpus is separated by language, and the subset in Portuguese contains 92GiB of data (13.76 billion words). This subset includes the jurisprudence of the Court of Justice of São Paulo (CJPG), appeals from the [5th Regional Federal Court (BRCAD-5)](https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0272287), the Portuguese subset of legal documents from the European Union, known as [EUR-Lex](https://eur-lex.europa.eu/homepage.html), and a filter for legal documents from [MC4](http://arxiv.org/abs/2010.11934).
- **Ulysses-Tesemõ**: a legal corpus in Brazilian Portuguese, composed of 2.2 million documents, totaling about 26GiB of text obtained from 96 different data sources. These sources encompass legal, legislative, academic papers, news, and related comments. The data was collected through web scraping of government websites.
- **ParlamentoPT**: a corpus for training language models in European Portuguese. The data was collected from the Portuguese government portal and consists of 2.6 million documents of transcriptions of debates in the Portuguese Parliament.
- **Iudicium Textum**: consists of rulings, votes, and reports from the Supreme Federal Court (STF) of Brazil, published between 2010 and 2018. The dataset contains 1GiB of data extracted from PDFs.
- **Acordãos TCU**: an open dataset from the Tribunal de Contas da União (Brazilian Federal Court of Accounts), containing 600,000 documents obtained by web scraping government websites. The documents span from 1992 to 2019.
- **DataSTF**: a dataset of monocratic decisions from the Superior Court of Justice (STJ) in Brazil, containing 700,000 documents (5GiB of data).
### Dataset Description
- **Curated by:** [More Information Needed]
- **Funded by:** [More Information Needed]
- **Language(s) (NLP):** Brazilian Portuguese (pt-BR)
- **License:** [Creative Commons Attribution 4.0 International Public License](https://creativecommons.org/licenses/by/4.0/deed.en)
### Dataset Sources
- **Repository:** https://github.com/eduagarcia/roberta-legal-portuguese
- **Paper:** [More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Data Collection and Processing
LegalPT is deduplicated using the [MinHash algorithm](https://dl.acm.org/doi/abs/10.5555/647819.736184) and [Locality Sensitive Hashing](https://dspace.mit.edu/bitstream/handle/1721.1/134231/v008a014.pdf?sequence=2&isAllowed=y), following the approach of [Lee et al. (2022)](http://arxiv.org/abs/2107.06499).
We used 5-grams and a signature of size 256, considering two documents to be identical if their Jaccard Similarity exceeded 0.7.
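As a rough illustration of this setup (not the authors' actual pipeline), the sketch below uses the `datasketch` library — an assumption, since the implementation is not specified — with word-level 5-gram shingles, a 256-permutation signature, and a 0.7 Jaccard threshold; the toy documents are hypothetical:

```python
from datasketch import MinHash, MinHashLSH

NUM_PERM = 256      # signature size described above
THRESHOLD = 0.7     # Jaccard similarity cutoff for duplicates
NGRAM = 5           # word-level 5-grams

def shingles(text: str, n: int = NGRAM) -> set:
    """Return the set of word-level n-grams of a document."""
    tokens = text.lower().split()
    if len(tokens) <= n:
        return {" ".join(tokens)}
    return {" ".join(tokens[i:i + n]) for i in range(len(tokens) - n + 1)}

def minhash(text: str) -> MinHash:
    m = MinHash(num_perm=NUM_PERM)
    for sh in shingles(text):
        m.update(sh.encode("utf8"))
    return m

# LSH index: query() returns previously inserted keys whose estimated
# Jaccard similarity with the query signature exceeds the threshold.
lsh = MinHashLSH(threshold=THRESHOLD, num_perm=NUM_PERM)

docs = {  # hypothetical toy corpus
    "doc1": "acórdão do tribunal de contas da união sobre licitação pública",
    "doc2": "acórdão do tribunal de contas da união sobre licitação pública federal",
    "doc3": "transcrição de um debate no parlamento português",
}

duplicates = set()
for doc_id, text in docs.items():
    sig = minhash(text)
    if lsh.query(sig):            # near-duplicate of an earlier document
        duplicates.add(doc_id)
    else:
        lsh.insert(doc_id, sig)   # keep this one as the cluster representative

print("duplicates:", duplicates)  # expected: {"doc2"}
```

Here `doc2` shares six of its seven 5-gram shingles with `doc1` (Jaccard ≈ 0.86 > 0.7), so it is flagged as a duplicate, while `doc3` is kept.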
Duplicate rate found by the Minhash-LSH algorithm for the LegalPT corpus:
| **Corpus** | **Documents** | **Docs. after deduplication** | **Duplicates (%)** |
|--------------------------|:--------------:|:-----------------------------:|:------------------:|
| Ulysses-Tesemõ | 2,216,656 | 1,737,720 | 21.61 |
| MultiLegalPile (PT) | | | |
| CJPG | 14,068,634 | 6,260,096 | 55.50 |
| BRCAD-5 | 3,128,292 | 542,680 | 82.65 |
| EUR-Lex (Caselaw) | 104,312 | 78,893 | 24.37 |
| EUR-Lex (Contracts) | 11,581 | 8,511 | 26.51 |
| EUR-Lex (Legislation) | 232,556 | 95,024 | 59.14 |
| Legal MC4 | 191,174 | 187,637 | 1.85 |
| ParlamentoPT | 2,670,846 | 2,109,931 | 21.00 |
| Iudicium Textum | 198,387 | 153,373 | 22.69 |
| Acordãos TCU | 634,711 | 462,031 | 27.21 |
| DataSTF | 737,769 | 310,119 | 57.97 |
| **Total (LegalPT)** | **24,194,918** | **11,946,015** | **50.63** |
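The duplicate percentage in the last column follows directly from the two counts: duplicates (%) = (1 − docs. after deduplication / documents) × 100. For Ulysses-Tesemõ, for instance, (1 − 1,737,720 / 2,216,656) × 100 ≈ 21.61%.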
## Citation
```bibtex
@InProceedings{garcia2024_roberlexpt,
author="Garcia, Eduardo A. S.
and Silva, N{\'a}dia F. F.
and Siqueira, Felipe
and Gomes, Juliana R. S.
and Albuqueruqe, Hidelberg O.
and Souza, Ellen
and Lima, Eliomar
and De Carvalho, André",
title="RoBERTaLexPT: A Legal RoBERTa Model pretrained with deduplication for Portuguese",
booktitle="Computational Processing of the Portuguese Language",
year="2024",
publisher="Association for Computational Linguistics"
}
```
## Acknowledgment
This work has been supported by the AI Center of Excellence (Centro de Excelência em Inteligência Artificial – CEIA) of the Institute of Informatics at the Federal University of Goiás (INF-UFG). | eduagarcia/LegalPT_dedup | [
"size_categories:10M<n<100M",
"language:pt",
"license:cc-by-4.0",
"legal",
"arxiv:2306.02069",
"arxiv:2305.06721",
"arxiv:2010.11934",
"arxiv:2107.06499",
"region:us"
] | 2024-02-02T19:33:43+00:00 | {"language": ["pt"], "license": "cc-by-4.0", "size_categories": ["10M<n<100M"], "pretty_name": "LegalPT (deduplicated)", "dataset_info": [{"config_name": "acordaos_tcu", "features": [{"name": "id", "dtype": "int64"}, {"name": "text", "dtype": "string"}, {"name": "meta", "struct": [{"name": "dedup", "struct": [{"name": "exact_norm", "struct": [{"name": "cluster_main_idx", "dtype": "int64"}, {"name": "cluster_size", "dtype": "int64"}, {"name": "exact_hash_idx", "dtype": "int64"}, {"name": "is_duplicate", "dtype": "bool"}]}, {"name": "minhash", "struct": [{"name": "cluster_main_idx", "dtype": "int64"}, {"name": "cluster_size", "dtype": "int64"}, {"name": "is_duplicate", "dtype": "bool"}, {"name": "minhash_idx", "dtype": "int64"}]}]}]}], "splits": [{"name": "train", "num_bytes": 2543994549.48221, "num_examples": 462031}], "download_size": 1566036137, "dataset_size": 2543994549.48221}, {"config_name": "datastf", "features": [{"name": "text", "dtype": "string"}, {"name": "meta", "struct": [{"name": "dedup", "struct": [{"name": "exact_norm", "struct": [{"name": "cluster_main_idx", "dtype": "int64"}, {"name": "cluster_size", "dtype": "int64"}, {"name": "exact_hash_idx", "dtype": "int64"}, {"name": "is_duplicate", "dtype": "bool"}]}, {"name": "minhash", "struct": [{"name": "cluster_main_idx", "dtype": "int64"}, {"name": "cluster_size", "dtype": "int64"}, {"name": "is_duplicate", "dtype": "bool"}, {"name": "minhash_idx", "dtype": "int64"}]}]}]}, {"name": "id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 1555024472.2888384, "num_examples": 310119}], "download_size": 853863429, "dataset_size": 1555024472.2888384}, {"config_name": "iudicium_textum", "features": [{"name": "text", "dtype": "string"}, {"name": "meta", "struct": [{"name": "dedup", "struct": [{"name": "exact_norm", "struct": [{"name": "cluster_main_idx", "dtype": "int64"}, {"name": "cluster_size", "dtype": "int64"}, {"name": "exact_hash_idx", "dtype": "int64"}, {"name": "is_duplicate", "dtype": "bool"}]}, {"name": "minhash", "struct": [{"name": "cluster_main_idx", "dtype": "int64"}, {"name": "cluster_size", "dtype": "int64"}, {"name": "is_duplicate", "dtype": "bool"}, {"name": "minhash_idx", "dtype": "int64"}]}]}]}, {"name": "id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 692805629.2689289, "num_examples": 153373}], "download_size": 372281973, "dataset_size": 692805629.2689289}, {"config_name": "mlp_pt_BRCAD-5", "features": [{"name": "id", "dtype": "int64"}, {"name": "text", "dtype": "string"}, {"name": "meta", "struct": [{"name": "dedup", "struct": [{"name": "exact_norm", "struct": [{"name": "cluster_main_idx", "dtype": "int64"}, {"name": "cluster_size", "dtype": "int64"}, {"name": "exact_hash_idx", "dtype": "int64"}, {"name": "is_duplicate", "dtype": "bool"}]}, {"name": "minhash", "struct": [{"name": "cluster_main_idx", "dtype": "int64"}, {"name": "cluster_size", "dtype": "int64"}, {"name": "is_duplicate", "dtype": "bool"}, {"name": "minhash_idx", "dtype": "int64"}]}]}]}], "splits": [{"name": "train", "num_bytes": 3523570990.7531776, "num_examples": 542680}], "download_size": 1883985787, "dataset_size": 3523570990.7531776}, {"config_name": "mlp_pt_CJPG", "features": [{"name": "id", "dtype": "int64"}, {"name": "text", "dtype": "string"}, {"name": "meta", "struct": [{"name": "dedup", "struct": [{"name": "exact_norm", "struct": [{"name": "cluster_main_idx", "dtype": "int64"}, {"name": "cluster_size", "dtype": "int64"}, {"name": "exact_hash_idx", "dtype": "int64"}, {"name": 
"is_duplicate", "dtype": "bool"}]}, {"name": "minhash", "struct": [{"name": "cluster_main_idx", "dtype": "int64"}, {"name": "cluster_size", "dtype": "int64"}, {"name": "is_duplicate", "dtype": "bool"}, {"name": "minhash_idx", "dtype": "int64"}]}]}]}], "splits": [{"name": "train", "num_bytes": 28122511051.563988, "num_examples": 6260096}], "download_size": 19944599978, "dataset_size": 28122511051.563988}, {"config_name": "mlp_pt_eurlex-caselaw", "features": [{"name": "id", "dtype": "int64"}, {"name": "text", "dtype": "string"}, {"name": "meta", "struct": [{"name": "dedup", "struct": [{"name": "exact_norm", "struct": [{"name": "cluster_main_idx", "dtype": "int64"}, {"name": "cluster_size", "dtype": "int64"}, {"name": "exact_hash_idx", "dtype": "int64"}, {"name": "is_duplicate", "dtype": "bool"}]}, {"name": "minhash", "struct": [{"name": "cluster_main_idx", "dtype": "int64"}, {"name": "cluster_size", "dtype": "int64"}, {"name": "is_duplicate", "dtype": "bool"}, {"name": "minhash_idx", "dtype": "int64"}]}]}]}], "splits": [{"name": "train", "num_bytes": 1134175020.033026, "num_examples": 78893}], "download_size": 609610934, "dataset_size": 1134175020.033026}, {"config_name": "mlp_pt_eurlex-contracts", "features": [{"name": "id", "dtype": "int64"}, {"name": "text", "dtype": "string"}, {"name": "meta", "struct": [{"name": "dedup", "struct": [{"name": "exact_norm", "struct": [{"name": "cluster_main_idx", "dtype": "int64"}, {"name": "cluster_size", "dtype": "int64"}, {"name": "exact_hash_idx", "dtype": "int64"}, {"name": "is_duplicate", "dtype": "bool"}]}, {"name": "minhash", "struct": [{"name": "cluster_main_idx", "dtype": "int64"}, {"name": "cluster_size", "dtype": "int64"}, {"name": "is_duplicate", "dtype": "bool"}, {"name": "minhash_idx", "dtype": "int64"}]}]}]}], "splits": [{"name": "train", "num_bytes": 343350961.1607806, "num_examples": 8511}], "download_size": 99128584, "dataset_size": 343350961.1607806}, {"config_name": "mlp_pt_eurlex-legislation", "features": [{"name": "id", "dtype": "int64"}, {"name": "text", "dtype": "string"}, {"name": "meta", "struct": [{"name": "dedup", "struct": [{"name": "exact_norm", "struct": [{"name": "cluster_main_idx", "dtype": "int64"}, {"name": "cluster_size", "dtype": "int64"}, {"name": "exact_hash_idx", "dtype": "int64"}, {"name": "is_duplicate", "dtype": "bool"}]}, {"name": "minhash", "struct": [{"name": "cluster_main_idx", "dtype": "int64"}, {"name": "cluster_size", "dtype": "int64"}, {"name": "is_duplicate", "dtype": "bool"}, {"name": "minhash_idx", "dtype": "int64"}]}]}]}], "splits": [{"name": "train", "num_bytes": 2316503707.9080825, "num_examples": 95024}], "download_size": 1051142246, "dataset_size": 2316503707.9080825}, {"config_name": "mlp_pt_legal-mc4", "features": [{"name": "id", "dtype": "int64"}, {"name": "text", "dtype": "string"}, {"name": "meta", "struct": [{"name": "dedup", "struct": [{"name": "exact_norm", "struct": [{"name": "cluster_main_idx", "dtype": "int64"}, {"name": "cluster_size", "dtype": "int64"}, {"name": "exact_hash_idx", "dtype": "int64"}, {"name": "is_duplicate", "dtype": "bool"}]}, {"name": "minhash", "struct": [{"name": "cluster_main_idx", "dtype": "int64"}, {"name": "cluster_size", "dtype": "int64"}, {"name": "is_duplicate", "dtype": "bool"}, {"name": "minhash_idx", "dtype": "int64"}]}]}]}], "splits": [{"name": "train", "num_bytes": 4400930935.870118, "num_examples": 187637}], "download_size": 2206590934, "dataset_size": 4400930935.870118}, {"config_name": "parlamento-pt", "features": [{"name": "id", "dtype": "int64"}, 
{"name": "text", "dtype": "string"}, {"name": "meta", "struct": [{"name": "dedup", "struct": [{"name": "exact_norm", "struct": [{"name": "cluster_main_idx", "dtype": "int64"}, {"name": "cluster_size", "dtype": "int64"}, {"name": "exact_hash_idx", "dtype": "int64"}, {"name": "is_duplicate", "dtype": "bool"}]}, {"name": "minhash", "struct": [{"name": "cluster_main_idx", "dtype": "int64"}, {"name": "cluster_size", "dtype": "int64"}, {"name": "is_duplicate", "dtype": "bool"}, {"name": "minhash_idx", "dtype": "int64"}]}]}]}], "splits": [{"name": "train", "num_bytes": 2265120232.5456176, "num_examples": 2109931}], "download_size": 1189159296, "dataset_size": 2265120232.5456176}], "configs": [{"config_name": "acordaos_tcu", "data_files": [{"split": "train", "path": "acordaos_tcu/train-*"}]}, {"config_name": "datastf", "data_files": [{"split": "train", "path": "datastf/train-*"}]}, {"config_name": "iudicium_textum", "data_files": [{"split": "train", "path": "iudicium_textum/train-*"}]}, {"config_name": "mlp_pt_BRCAD-5", "data_files": [{"split": "train", "path": "mlp_pt_BRCAD-5/train-*"}]}, {"config_name": "mlp_pt_CJPG", "data_files": [{"split": "train", "path": "mlp_pt_CJPG/train-*"}]}, {"config_name": "mlp_pt_eurlex-caselaw", "data_files": [{"split": "train", "path": "mlp_pt_eurlex-caselaw/train-*"}]}, {"config_name": "mlp_pt_eurlex-contracts", "data_files": [{"split": "train", "path": "mlp_pt_eurlex-contracts/train-*"}]}, {"config_name": "mlp_pt_eurlex-legislation", "data_files": [{"split": "train", "path": "mlp_pt_eurlex-legislation/train-*"}]}, {"config_name": "mlp_pt_legal-mc4", "data_files": [{"split": "train", "path": "mlp_pt_legal-mc4/train-*"}]}, {"config_name": "parlamento-pt", "data_files": [{"split": "train", "path": "parlamento-pt/train-*"}]}], "tags": ["legal"]} | 2024-02-09T16:42:03+00:00 | [
"2306.02069",
"2305.06721",
"2010.11934",
"2107.06499"
] | [
"pt"
] | TAGS
#size_categories-10M<n<100M #language-Portuguese #license-cc-by-4.0 #legal #arxiv-2306.02069 #arxiv-2305.06721 #arxiv-2010.11934 #arxiv-2107.06499 #region-us
| LegalPT (deduplicated)
======================
LegalPT aggregates the maximum amount of publicly available legal data in Portuguese, drawing from varied sources including legislation, jurisprudence, legal articles, and government documents.
Dataset Details
---------------
The dataset is composed of six corpora:
Ulysses-Tesemõ, MultiLegalPile (PT), ParlamentoPT,
Iudicium Textum, Acordãos TCU, and
DataSTF.
* MultiLegalPile: a multilingual corpus of legal texts comprising 689 GiB of data, covering 24 languages in 17 jurisdictions. The corpus is separated by language, and the subset in Portuguese contains 92GiB of data (13.76 billion words). This subset includes the jurisprudence of the Court of Justice of São Paulo (CJPG), appeals from the 5th Regional Federal Court (BRCAD-5), the Portuguese subset of legal documents from the European Union, known as EUR-Lex, and a filter for legal documents from MC4.
* Ulysses-Tesemõ: a legal corpus in Brazilian Portuguese, composed of 2.2 million documents, totaling about 26GiB of text obtained from 96 different data sources. These sources encompass legal, legislative, academic papers, news, and related comments. The data was collected through web scraping of government websites.
* ParlamentoPT: a corpus for training language models in European Portuguese. The data was collected from the Portuguese government portal and consists of 2.6 million documents of transcriptions of debates in the Portuguese Parliament.
* Iudicium Textum: consists of rulings, votes, and reports from the Supreme Federal Court (STF) of Brazil, published between 2010 and 2018. The dataset contains 1GiB of data extracted from PDFs.
* Acordãos TCU: an open dataset from the Tribunal de Contas da União (Brazilian Federal Court of Accounts), containing 600,000 documents obtained by web scraping government websites. The documents span from 1992 to 2019.
* DataSTF: a dataset of monocratic decisions from the Superior Court of Justice (STJ) in Brazil, containing 700,000 documents (5GiB of data).
### Dataset Description
* Curated by:
* Funded by:
* Language(s) (NLP): Brazilian Portuguese (pt-BR)
* License: Creative Commons Attribution 4.0 International Public License
### Dataset Sources
* Repository: URL
* Paper:
Dataset Structure
-----------------
Data Collection and Processing
------------------------------
LegalPT is deduplicated using the MinHash algorithm and Locality Sensitive Hashing, following the approach of Lee et al. (2022).
We used 5-grams and a signature of size 256, considering two documents to be identical if their Jaccard Similarity exceeded 0.7.
Duplicate rate found by the Minhash-LSH algorithm for the LegalPT corpus:
Acknowledgment
--------------
This work has been supported by the AI Center of Excellence (Centro de Excelência em Inteligência Artificial – CEIA) of the Institute of Informatics at the Federal University of Goiás (INF-UFG).
| [
"### Dataset Description\n\n\n* Curated by:\n* Funded by:\n* Language(s) (NLP): Brazilian Portuguese (pt-BR)\n* License: Creative Commons Attribution 4.0 International Public License",
"### Dataset Sources\n\n\n* Repository: URL\n* Paper:\n\n\nDataset Structure\n-----------------\n\n\nData Collection and Processing\n------------------------------\n\n\nLegalPT is deduplicated using MinHash algorithm and Locality Sensitive Hashing, following the approach of Lee et al. (2022).\n\n\nWe used 5-grams and a signature of size 256, considering two documents to be identical if their Jaccard Similarity exceeded 0.7.\n\n\nDuplicate rate found by the Minhash-LSH algorithm for the LegalPT corpus:\n\n\n\nAcknowledgment\n--------------\n\n\nThis work has been supported by the AI Center of Excellence (Centro de Excelência em Inteligência Artificial – CEIA) of the Institute of Informatics at the Federal University of Goiás (INF-UFG)."
] | [
"TAGS\n#size_categories-10M<n<100M #language-Portuguese #license-cc-by-4.0 #legal #arxiv-2306.02069 #arxiv-2305.06721 #arxiv-2010.11934 #arxiv-2107.06499 #region-us \n",
"### Dataset Description\n\n\n* Curated by:\n* Funded by:\n* Language(s) (NLP): Brazilian Portuguese (pt-BR)\n* License: Creative Commons Attribution 4.0 International Public License",
"### Dataset Sources\n\n\n* Repository: URL\n* Paper:\n\n\nDataset Structure\n-----------------\n\n\nData Collection and Processing\n------------------------------\n\n\nLegalPT is deduplicated using MinHash algorithm and Locality Sensitive Hashing, following the approach of Lee et al. (2022).\n\n\nWe used 5-grams and a signature of size 256, considering two documents to be identical if their Jaccard Similarity exceeded 0.7.\n\n\nDuplicate rate found by the Minhash-LSH algorithm for the LegalPT corpus:\n\n\n\nAcknowledgment\n--------------\n\n\nThis work has been supported by the AI Center of Excellence (Centro de Excelência em Inteligência Artificial – CEIA) of the Institute of Informatics at the Federal University of Goiás (INF-UFG)."
] |
37061637381bfdd3315d65c0b13f4a2178ed29e5 |
# Dataset Card for Evaluation run of BFauber/opt125m_10e5_30ep
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BFauber/opt125m_10e5_30ep](https://huggingface.co/BFauber/opt125m_10e5_30ep) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BFauber__opt125m_10e5_30ep",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-02T19:37:52.073116](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt125m_10e5_30ep/blob/main/results_2024-02-02T19-37-52.073116.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.240060989381837,
"acc_stderr": 0.03017661784008925,
"acc_norm": 0.24034190934400632,
"acc_norm_stderr": 0.030973459310700287,
"mc1": 0.24724602203182375,
"mc1_stderr": 0.01510240479735965,
"mc2": 0.4721732134354149,
"mc2_stderr": 0.01570223334109098
},
"harness|arc:challenge|25": {
"acc": 0.2226962457337884,
"acc_stderr": 0.012158314774829931,
"acc_norm": 0.25597269624573377,
"acc_norm_stderr": 0.012753013241244513
},
"harness|hellaswag|10": {
"acc": 0.27972515435172274,
"acc_stderr": 0.004479467619464779,
"acc_norm": 0.30302728540131446,
"acc_norm_stderr": 0.004586276903267079
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.22264150943396227,
"acc_stderr": 0.0256042334708991,
"acc_norm": 0.22264150943396227,
"acc_norm_stderr": 0.0256042334708991
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036622,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036622
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.03156809362703174,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.03156809362703174
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179962,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179962
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2425531914893617,
"acc_stderr": 0.028020226271200217,
"acc_norm": 0.2425531914893617,
"acc_norm_stderr": 0.028020226271200217
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21929824561403508,
"acc_stderr": 0.03892431106518752,
"acc_norm": 0.21929824561403508,
"acc_norm_stderr": 0.03892431106518752
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3448275862068966,
"acc_stderr": 0.03960933549451207,
"acc_norm": 0.3448275862068966,
"acc_norm_stderr": 0.03960933549451207
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2328042328042328,
"acc_stderr": 0.02176596167215452,
"acc_norm": 0.2328042328042328,
"acc_norm_stderr": 0.02176596167215452
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15079365079365079,
"acc_stderr": 0.03200686497287392,
"acc_norm": 0.15079365079365079,
"acc_norm_stderr": 0.03200686497287392
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2064516129032258,
"acc_stderr": 0.023025899617188716,
"acc_norm": 0.2064516129032258,
"acc_norm_stderr": 0.023025899617188716
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.22660098522167488,
"acc_stderr": 0.02945486383529298,
"acc_norm": 0.22660098522167488,
"acc_norm_stderr": 0.02945486383529298
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.19696969696969696,
"acc_stderr": 0.028335609732463348,
"acc_norm": 0.19696969696969696,
"acc_norm_stderr": 0.028335609732463348
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21243523316062177,
"acc_stderr": 0.029519282616817244,
"acc_norm": 0.21243523316062177,
"acc_norm_stderr": 0.029519282616817244
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24102564102564103,
"acc_stderr": 0.021685546665333188,
"acc_norm": 0.24102564102564103,
"acc_norm_stderr": 0.021685546665333188
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.02646611753895991,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.02646611753895991
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.18487394957983194,
"acc_stderr": 0.025215992877954205,
"acc_norm": 0.18487394957983194,
"acc_norm_stderr": 0.025215992877954205
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.25165562913907286,
"acc_stderr": 0.035433042343899844,
"acc_norm": 0.25165562913907286,
"acc_norm_stderr": 0.035433042343899844
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22568807339449543,
"acc_stderr": 0.01792308766780306,
"acc_norm": 0.22568807339449543,
"acc_norm_stderr": 0.01792308766780306
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.18055555555555555,
"acc_stderr": 0.026232878971491652,
"acc_norm": 0.18055555555555555,
"acc_norm_stderr": 0.026232878971491652
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.030778554678693257,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.030778554678693257
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.028756799629658342,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.028756799629658342
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.21973094170403587,
"acc_stderr": 0.027790177064383605,
"acc_norm": 0.21973094170403587,
"acc_norm_stderr": 0.027790177064383605
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2366412213740458,
"acc_stderr": 0.03727673575596918,
"acc_norm": 0.2366412213740458,
"acc_norm_stderr": 0.03727673575596918
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052192,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2147239263803681,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.2147239263803681,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25,
"acc_stderr": 0.04109974682633932,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04109974682633932
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2264957264957265,
"acc_stderr": 0.027421007295392912,
"acc_norm": 0.2264957264957265,
"acc_norm_stderr": 0.027421007295392912
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26181353767560667,
"acc_stderr": 0.015720838678445266,
"acc_norm": 0.26181353767560667,
"acc_norm_stderr": 0.015720838678445266
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2514450867052023,
"acc_stderr": 0.02335736578587404,
"acc_norm": 0.2514450867052023,
"acc_norm_stderr": 0.02335736578587404
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2581699346405229,
"acc_stderr": 0.025058503316958157,
"acc_norm": 0.2581699346405229,
"acc_norm_stderr": 0.025058503316958157
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2829581993569132,
"acc_stderr": 0.02558306248998483,
"acc_norm": 0.2829581993569132,
"acc_norm_stderr": 0.02558306248998483
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2191358024691358,
"acc_stderr": 0.0230167056402622,
"acc_norm": 0.2191358024691358,
"acc_norm_stderr": 0.0230167056402622
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.21631205673758866,
"acc_stderr": 0.0245617205605628,
"acc_norm": 0.21631205673758866,
"acc_norm_stderr": 0.0245617205605628
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.258148631029987,
"acc_stderr": 0.011176923719313402,
"acc_norm": 0.258148631029987,
"acc_norm_stderr": 0.011176923719313402
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.1875,
"acc_stderr": 0.023709788253811766,
"acc_norm": 0.1875,
"acc_norm_stderr": 0.023709788253811766
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.01740181671142766,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.01740181671142766
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2,
"acc_stderr": 0.03831305140884601,
"acc_norm": 0.2,
"acc_norm_stderr": 0.03831305140884601
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2163265306122449,
"acc_stderr": 0.026358916334904035,
"acc_norm": 0.2163265306122449,
"acc_norm_stderr": 0.026358916334904035
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.030360490154014652,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.030360490154014652
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3391812865497076,
"acc_stderr": 0.03631053496488905,
"acc_norm": 0.3391812865497076,
"acc_norm_stderr": 0.03631053496488905
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24724602203182375,
"mc1_stderr": 0.01510240479735965,
"mc2": 0.4721732134354149,
"mc2_stderr": 0.01570223334109098
},
"harness|winogrande|5": {
"acc": 0.5201262825572218,
"acc_stderr": 0.014041096664344329
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_BFauber__opt125m_10e5_30ep | [
"region:us"
] | 2024-02-02T19:39:37+00:00 | {"pretty_name": "Evaluation run of BFauber/opt125m_10e5_30ep", "dataset_summary": "Dataset automatically created during the evaluation run of model [BFauber/opt125m_10e5_30ep](https://huggingface.co/BFauber/opt125m_10e5_30ep) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BFauber__opt125m_10e5_30ep\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T19:37:52.073116](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt125m_10e5_30ep/blob/main/results_2024-02-02T19-37-52.073116.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.240060989381837,\n \"acc_stderr\": 0.03017661784008925,\n \"acc_norm\": 0.24034190934400632,\n \"acc_norm_stderr\": 0.030973459310700287,\n \"mc1\": 0.24724602203182375,\n \"mc1_stderr\": 0.01510240479735965,\n \"mc2\": 0.4721732134354149,\n \"mc2_stderr\": 0.01570223334109098\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.2226962457337884,\n \"acc_stderr\": 0.012158314774829931,\n \"acc_norm\": 0.25597269624573377,\n \"acc_norm_stderr\": 0.012753013241244513\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.27972515435172274,\n \"acc_stderr\": 0.004479467619464779,\n \"acc_norm\": 0.30302728540131446,\n \"acc_norm_stderr\": 0.004586276903267079\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.22264150943396227,\n \"acc_stderr\": 0.0256042334708991,\n \"acc_norm\": 0.22264150943396227,\n \"acc_norm_stderr\": 0.0256042334708991\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036622,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036622\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n \"acc_stderr\": 0.03156809362703174,\n \"acc_norm\": 0.21965317919075145,\n \"acc_norm_stderr\": 0.03156809362703174\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179962,\n \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179962\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2425531914893617,\n \"acc_stderr\": 0.028020226271200217,\n \"acc_norm\": 0.2425531914893617,\n \"acc_norm_stderr\": 0.028020226271200217\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21929824561403508,\n \"acc_stderr\": 0.03892431106518752,\n \"acc_norm\": 0.21929824561403508,\n \"acc_norm_stderr\": 0.03892431106518752\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.3448275862068966,\n \"acc_stderr\": 0.03960933549451207,\n \"acc_norm\": 0.3448275862068966,\n \"acc_norm_stderr\": 0.03960933549451207\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2328042328042328,\n \"acc_stderr\": 0.02176596167215452,\n \"acc_norm\": 0.2328042328042328,\n \"acc_norm_stderr\": 0.02176596167215452\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15079365079365079,\n \"acc_stderr\": 0.03200686497287392,\n \"acc_norm\": 0.15079365079365079,\n \"acc_norm_stderr\": 0.03200686497287392\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2064516129032258,\n \"acc_stderr\": 0.023025899617188716,\n \"acc_norm\": 0.2064516129032258,\n \"acc_norm_stderr\": 0.023025899617188716\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.22660098522167488,\n \"acc_stderr\": 0.02945486383529298,\n \"acc_norm\": 0.22660098522167488,\n \"acc_norm_stderr\": 0.02945486383529298\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.19696969696969696,\n \"acc_stderr\": 0.028335609732463348,\n \"acc_norm\": 0.19696969696969696,\n \"acc_norm_stderr\": 0.028335609732463348\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.21243523316062177,\n \"acc_stderr\": 0.029519282616817244,\n \"acc_norm\": 0.21243523316062177,\n \"acc_norm_stderr\": 0.029519282616817244\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.24102564102564103,\n \"acc_stderr\": 0.021685546665333188,\n \"acc_norm\": 0.24102564102564103,\n \"acc_norm_stderr\": 0.021685546665333188\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2518518518518518,\n \"acc_stderr\": 0.02646611753895991,\n \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.02646611753895991\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.18487394957983194,\n \"acc_stderr\": 0.025215992877954205,\n \"acc_norm\": 0.18487394957983194,\n \"acc_norm_stderr\": 0.025215992877954205\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.25165562913907286,\n \"acc_stderr\": 0.035433042343899844,\n \"acc_norm\": 0.25165562913907286,\n \"acc_norm_stderr\": 0.035433042343899844\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.22568807339449543,\n \"acc_stderr\": 0.01792308766780306,\n \"acc_norm\": 0.22568807339449543,\n \"acc_norm_stderr\": 0.01792308766780306\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.18055555555555555,\n \"acc_stderr\": 0.026232878971491652,\n \"acc_norm\": 0.18055555555555555,\n \"acc_norm_stderr\": 0.026232878971491652\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25980392156862747,\n \"acc_stderr\": 0.030778554678693257,\n \"acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.030778554678693257\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.26582278481012656,\n \"acc_stderr\": 0.028756799629658342,\n \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.028756799629658342\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.21973094170403587,\n \"acc_stderr\": 0.027790177064383605,\n \"acc_norm\": 0.21973094170403587,\n \"acc_norm_stderr\": 0.027790177064383605\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.03727673575596918,\n \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.03727673575596918\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.04065578140908705,\n \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.04065578140908705\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.04284467968052192,\n \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.04284467968052192\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2147239263803681,\n \"acc_stderr\": 0.032262193772867744,\n \"acc_norm\": 0.2147239263803681,\n \"acc_norm_stderr\": 0.032262193772867744\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04109974682633932,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04109974682633932\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2264957264957265,\n \"acc_stderr\": 0.027421007295392912,\n \"acc_norm\": 0.2264957264957265,\n \"acc_norm_stderr\": 0.027421007295392912\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.26181353767560667,\n \"acc_stderr\": 0.015720838678445266,\n \"acc_norm\": 0.26181353767560667,\n \"acc_norm_stderr\": 0.015720838678445266\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2514450867052023,\n \"acc_stderr\": 0.02335736578587404,\n \"acc_norm\": 0.2514450867052023,\n \"acc_norm_stderr\": 0.02335736578587404\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2581699346405229,\n \"acc_stderr\": 0.025058503316958157,\n \"acc_norm\": 0.2581699346405229,\n \"acc_norm_stderr\": 0.025058503316958157\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2829581993569132,\n \"acc_stderr\": 0.02558306248998483,\n \"acc_norm\": 0.2829581993569132,\n \"acc_norm_stderr\": 0.02558306248998483\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2191358024691358,\n \"acc_stderr\": 0.0230167056402622,\n \"acc_norm\": 0.2191358024691358,\n \"acc_norm_stderr\": 0.0230167056402622\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.21631205673758866,\n \"acc_stderr\": 0.0245617205605628,\n \"acc_norm\": 0.21631205673758866,\n \"acc_norm_stderr\": 0.0245617205605628\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.258148631029987,\n \"acc_stderr\": 0.011176923719313402,\n \"acc_norm\": 0.258148631029987,\n \"acc_norm_stderr\": 0.011176923719313402\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.1875,\n \"acc_stderr\": 0.023709788253811766,\n \"acc_norm\": 0.1875,\n \"acc_norm_stderr\": 0.023709788253811766\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.01740181671142766,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.01740181671142766\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.03831305140884601,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.03831305140884601\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.2163265306122449,\n \"acc_stderr\": 0.026358916334904035,\n \"acc_norm\": 0.2163265306122449,\n \"acc_norm_stderr\": 0.026358916334904035\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.030360490154014652,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.030360490154014652\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3391812865497076,\n \"acc_stderr\": 0.03631053496488905,\n \"acc_norm\": 0.3391812865497076,\n \"acc_norm_stderr\": 0.03631053496488905\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24724602203182375,\n \"mc1_stderr\": 0.01510240479735965,\n \"mc2\": 0.4721732134354149,\n \"mc2_stderr\": 0.01570223334109098\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5201262825572218,\n \"acc_stderr\": 0.014041096664344329\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": 
"https://huggingface.co/BFauber/opt125m_10e5_30ep", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|arc:challenge|25_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|gsm8k|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hellaswag|10_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-37-52.073116.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-37-52.073116.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-37-52.073116.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T19-37-52.073116.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-37-52.073116.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T19_37_52.073116", "path": ["**/details_harness|winogrande|5_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T19-37-52.073116.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T19_37_52.073116", "path": ["results_2024-02-02T19-37-52.073116.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T19-37-52.073116.parquet"]}]}]} | 2024-02-02T19:40:02+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of BFauber/opt125m_10e5_30ep
Dataset automatically created during the evaluation run of model BFauber/opt125m_10e5_30ep on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
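The code snippet was stripped from this rendering of the card; a minimal sketch of the load call follows, assuming this details repo follows the `open-llm-leaderboard/details_<org>__<model>` naming used by the other cards in this dump:

```python
from datasets import load_dataset

# Load one of the 63 per-task configurations from this run's details repo.
# The repo id below is inferred from the card's repo_url and naming pattern.
data = load_dataset("open-llm-leaderboard/details_BFauber__opt125m_10e5_30ep",
                    "harness_winogrande_5",
                    split="train")
```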
## Latest results
These are the latest results from run 2024-02-02T19:37:52.073116 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of BFauber/opt125m_10e5_30ep\n\n\n\nDataset automatically created during the evaluation run of model BFauber/opt125m_10e5_30ep on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T19:37:52.073116(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of BFauber/opt125m_10e5_30ep\n\n\n\nDataset automatically created during the evaluation run of model BFauber/opt125m_10e5_30ep on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T19:37:52.073116(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
6f502660b47f995884dcdd76eb574a08fad682dd |
# Dataset Card for Evaluation run of BFauber/opt125m_10e5_40ep
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BFauber/opt125m_10e5_40ep](https://huggingface.co/BFauber/opt125m_10e5_40ep) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BFauber__opt125m_10e5_40ep",
"harness_winogrande_5",
split="train")
```
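Since the repo exposes 63 configurations, it can be convenient to enumerate them before loading. A short sketch using the `get_dataset_config_names` helper from the `datasets` library (the exact config list is whatever the repo currently publishes):

```python
from datasets import get_dataset_config_names

# One config per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names("open-llm-leaderboard/details_BFauber__opt125m_10e5_40ep")
print(len(configs))
print(configs[:5])
```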
## Latest results
These are the [latest results from run 2024-02-02T19:44:13.671320](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt125m_10e5_40ep/blob/main/results_2024-02-02T19-44-13.671320.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2385010715928439,
"acc_stderr": 0.030097358112278164,
"acc_norm": 0.238644520881259,
"acc_norm_stderr": 0.030887009025888052,
"mc1": 0.26193390452876375,
"mc1_stderr": 0.015392118805015023,
"mc2": 0.49021738591692526,
"mc2_stderr": 0.015824528690198854
},
"harness|arc:challenge|25": {
"acc": 0.22440273037542663,
"acc_stderr": 0.01219140493860384,
"acc_norm": 0.24232081911262798,
"acc_norm_stderr": 0.012521593295800118
},
"harness|hellaswag|10": {
"acc": 0.2748456482772356,
"acc_stderr": 0.004455240755811554,
"acc_norm": 0.29904401513642703,
"acc_norm_stderr": 0.004569034613332606
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.19245283018867926,
"acc_stderr": 0.024262979839372267,
"acc_norm": 0.19245283018867926,
"acc_norm_stderr": 0.024262979839372267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036844,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036844
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.031568093627031744,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.031568093627031744
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.251063829787234,
"acc_stderr": 0.028346963777162452,
"acc_norm": 0.251063829787234,
"acc_norm_stderr": 0.028346963777162452
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.0433913832257986,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.0433913832257986
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3448275862068966,
"acc_stderr": 0.03960933549451207,
"acc_norm": 0.3448275862068966,
"acc_norm_stderr": 0.03960933549451207
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.21164021164021163,
"acc_stderr": 0.021037331505262883,
"acc_norm": 0.21164021164021163,
"acc_norm_stderr": 0.021037331505262883
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15079365079365079,
"acc_stderr": 0.03200686497287392,
"acc_norm": 0.15079365079365079,
"acc_norm_stderr": 0.03200686497287392
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2064516129032258,
"acc_stderr": 0.023025899617188712,
"acc_norm": 0.2064516129032258,
"acc_norm_stderr": 0.023025899617188712
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.22660098522167488,
"acc_stderr": 0.02945486383529297,
"acc_norm": 0.22660098522167488,
"acc_norm_stderr": 0.02945486383529297
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.20207253886010362,
"acc_stderr": 0.02897908979429673,
"acc_norm": 0.20207253886010362,
"acc_norm_stderr": 0.02897908979429673
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2205128205128205,
"acc_stderr": 0.021020672680827912,
"acc_norm": 0.2205128205128205,
"acc_norm_stderr": 0.021020672680827912
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.18907563025210083,
"acc_stderr": 0.025435119438105346,
"acc_norm": 0.18907563025210083,
"acc_norm_stderr": 0.025435119438105346
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2251655629139073,
"acc_stderr": 0.03410435282008937,
"acc_norm": 0.2251655629139073,
"acc_norm_stderr": 0.03410435282008937
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22935779816513763,
"acc_stderr": 0.018025349724618684,
"acc_norm": 0.22935779816513763,
"acc_norm_stderr": 0.018025349724618684
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1388888888888889,
"acc_stderr": 0.023585447368900146,
"acc_norm": 0.1388888888888889,
"acc_norm_stderr": 0.023585447368900146
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.03132179803083291,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.03132179803083291
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2742616033755274,
"acc_stderr": 0.029041333510598028,
"acc_norm": 0.2742616033755274,
"acc_norm_stderr": 0.029041333510598028
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.29596412556053814,
"acc_stderr": 0.030636591348699813,
"acc_norm": 0.29596412556053814,
"acc_norm_stderr": 0.030636591348699813
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052192,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.19631901840490798,
"acc_stderr": 0.031207970394709218,
"acc_norm": 0.19631901840490798,
"acc_norm_stderr": 0.031207970394709218
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291519,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291519
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.029745048572674043,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.029745048572674043
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.25287356321839083,
"acc_stderr": 0.015543377313719681,
"acc_norm": 0.25287356321839083,
"acc_norm_stderr": 0.015543377313719681
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.02495418432487991,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.02495418432487991
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.21221864951768488,
"acc_stderr": 0.023222756797435122,
"acc_norm": 0.21221864951768488,
"acc_norm_stderr": 0.023222756797435122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2037037037037037,
"acc_stderr": 0.02240967454730419,
"acc_norm": 0.2037037037037037,
"acc_norm_stderr": 0.02240967454730419
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.22340425531914893,
"acc_stderr": 0.024847921358063962,
"acc_norm": 0.22340425531914893,
"acc_norm_stderr": 0.024847921358063962
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24511082138200782,
"acc_stderr": 0.010986307870045517,
"acc_norm": 0.24511082138200782,
"acc_norm_stderr": 0.010986307870045517
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.16911764705882354,
"acc_stderr": 0.02277086801011299,
"acc_norm": 0.16911764705882354,
"acc_norm_stderr": 0.02277086801011299
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.14285714285714285,
"acc_stderr": 0.022401787435256393,
"acc_norm": 0.14285714285714285,
"acc_norm_stderr": 0.022401787435256393
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916707,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3391812865497076,
"acc_stderr": 0.03631053496488905,
"acc_norm": 0.3391812865497076,
"acc_norm_stderr": 0.03631053496488905
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26193390452876375,
"mc1_stderr": 0.015392118805015023,
"mc2": 0.49021738591692526,
"mc2_stderr": 0.015824528690198854
},
"harness|winogrande|5": {
"acc": 0.510655090765588,
"acc_stderr": 0.014049294536290396
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
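The same aggregated numbers can be read programmatically from the "results" configuration mentioned above. A minimal sketch; the "latest" split is defined in this repo's metadata and always points at the most recent run:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics shown in the JSON above;
# the "latest" split resolves to the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_BFauber__opt125m_10e5_40ep",
                       "results",
                       split="latest")
print(results[0])  # inspect the aggregated record for this run
```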
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_BFauber__opt125m_10e5_40ep | [
"region:us"
] | 2024-02-02T19:45:58+00:00 | {"pretty_name": "Evaluation run of BFauber/opt125m_10e5_40ep", "dataset_summary": "Dataset automatically created during the evaluation run of model [BFauber/opt125m_10e5_40ep](https://huggingface.co/BFauber/opt125m_10e5_40ep) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BFauber__opt125m_10e5_40ep\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T19:44:13.671320](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt125m_10e5_40ep/blob/main/results_2024-02-02T19-44-13.671320.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2385010715928439,\n \"acc_stderr\": 0.030097358112278164,\n \"acc_norm\": 0.238644520881259,\n \"acc_norm_stderr\": 0.030887009025888052,\n \"mc1\": 0.26193390452876375,\n \"mc1_stderr\": 0.015392118805015023,\n \"mc2\": 0.49021738591692526,\n \"mc2_stderr\": 0.015824528690198854\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.22440273037542663,\n \"acc_stderr\": 0.01219140493860384,\n \"acc_norm\": 0.24232081911262798,\n \"acc_norm_stderr\": 0.012521593295800118\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2748456482772356,\n \"acc_stderr\": 0.004455240755811554,\n \"acc_norm\": 0.29904401513642703,\n \"acc_norm_stderr\": 0.004569034613332606\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.19245283018867926,\n \"acc_stderr\": 0.024262979839372267,\n \"acc_norm\": 0.19245283018867926,\n \"acc_norm_stderr\": 0.024262979839372267\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": 
{\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036844,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036844\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.21965317919075145,\n \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.251063829787234,\n \"acc_stderr\": 0.028346963777162452,\n \"acc_norm\": 0.251063829787234,\n \"acc_norm_stderr\": 0.028346963777162452\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n \"acc_stderr\": 0.0433913832257986,\n \"acc_norm\": 0.30701754385964913,\n \"acc_norm_stderr\": 0.0433913832257986\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.3448275862068966,\n \"acc_stderr\": 0.03960933549451207,\n \"acc_norm\": 0.3448275862068966,\n \"acc_norm_stderr\": 0.03960933549451207\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.21164021164021163,\n \"acc_stderr\": 0.021037331505262883,\n \"acc_norm\": 0.21164021164021163,\n \"acc_norm_stderr\": 0.021037331505262883\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15079365079365079,\n \"acc_stderr\": 0.03200686497287392,\n \"acc_norm\": 0.15079365079365079,\n \"acc_norm_stderr\": 0.03200686497287392\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2064516129032258,\n \"acc_stderr\": 0.023025899617188712,\n \"acc_norm\": 0.2064516129032258,\n \"acc_norm_stderr\": 0.023025899617188712\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.22660098522167488,\n \"acc_stderr\": 0.02945486383529297,\n \"acc_norm\": 0.22660098522167488,\n \"acc_norm_stderr\": 0.02945486383529297\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.20207253886010362,\n \"acc_stderr\": 0.02897908979429673,\n \"acc_norm\": 0.20207253886010362,\n \"acc_norm_stderr\": 0.02897908979429673\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2205128205128205,\n \"acc_stderr\": 0.021020672680827912,\n \"acc_norm\": 0.2205128205128205,\n \"acc_norm_stderr\": 0.021020672680827912\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.18907563025210083,\n \"acc_stderr\": 0.025435119438105346,\n \"acc_norm\": 0.18907563025210083,\n \"acc_norm_stderr\": 0.025435119438105346\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2251655629139073,\n \"acc_stderr\": 0.03410435282008937,\n \"acc_norm\": 0.2251655629139073,\n \"acc_norm_stderr\": 0.03410435282008937\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.22935779816513763,\n \"acc_stderr\": 0.018025349724618684,\n \"acc_norm\": 0.22935779816513763,\n \"acc_norm_stderr\": 0.018025349724618684\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1388888888888889,\n \"acc_stderr\": 0.023585447368900146,\n \"acc_norm\": 0.1388888888888889,\n \"acc_norm_stderr\": 0.023585447368900146\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.03132179803083291,\n \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.03132179803083291\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2742616033755274,\n \"acc_stderr\": 0.029041333510598028,\n \"acc_norm\": 0.2742616033755274,\n \"acc_norm_stderr\": 0.029041333510598028\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.29596412556053814,\n \"acc_stderr\": 0.030636591348699813,\n \"acc_norm\": 0.29596412556053814,\n \"acc_norm_stderr\": 0.030636591348699813\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306086,\n \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306086\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.04284467968052192,\n \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.04284467968052192\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.19631901840490798,\n \"acc_stderr\": 0.031207970394709218,\n \"acc_norm\": 0.19631901840490798,\n \"acc_norm_stderr\": 0.031207970394709218\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n \"acc_stderr\": 0.04432804055291519,\n \"acc_norm\": 0.32142857142857145,\n \"acc_norm_stderr\": 0.04432804055291519\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n \"acc_stderr\": 0.029745048572674043,\n \"acc_norm\": 0.2905982905982906,\n \"acc_norm_stderr\": 0.029745048572674043\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.25287356321839083,\n \"acc_stderr\": 0.015543377313719681,\n \"acc_norm\": 0.25287356321839083,\n \"acc_norm_stderr\": 0.015543377313719681\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.02495418432487991,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.02495418432487991\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.21221864951768488,\n \"acc_stderr\": 0.023222756797435122,\n \"acc_norm\": 0.21221864951768488,\n \"acc_norm_stderr\": 0.023222756797435122\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2037037037037037,\n \"acc_stderr\": 0.02240967454730419,\n \"acc_norm\": 0.2037037037037037,\n \"acc_norm_stderr\": 0.02240967454730419\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.22340425531914893,\n \"acc_stderr\": 0.024847921358063962,\n \"acc_norm\": 0.22340425531914893,\n \"acc_norm_stderr\": 0.024847921358063962\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24511082138200782,\n \"acc_stderr\": 0.010986307870045517,\n \"acc_norm\": 0.24511082138200782,\n \"acc_norm_stderr\": 0.010986307870045517\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.16911764705882354,\n \"acc_stderr\": 0.02277086801011299,\n \"acc_norm\": 0.16911764705882354,\n \"acc_norm_stderr\": 0.02277086801011299\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.14285714285714285,\n \"acc_stderr\": 0.022401787435256393,\n \"acc_norm\": 0.14285714285714285,\n \"acc_norm_stderr\": 0.022401787435256393\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n \"acc_stderr\": 0.030567675938916707,\n \"acc_norm\": 0.24875621890547264,\n \"acc_norm_stderr\": 0.030567675938916707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3391812865497076,\n \"acc_stderr\": 0.03631053496488905,\n \"acc_norm\": 0.3391812865497076,\n \"acc_norm_stderr\": 0.03631053496488905\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26193390452876375,\n \"mc1_stderr\": 0.015392118805015023,\n \"mc2\": 0.49021738591692526,\n \"mc2_stderr\": 0.015824528690198854\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.510655090765588,\n \"acc_stderr\": 0.014049294536290396\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n 
\"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/BFauber/opt125m_10e5_40ep", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|arc:challenge|25_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|gsm8k|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hellaswag|10_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-44-13.671320.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-44-13.671320.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-44-13.671320.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T19-44-13.671320.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-44-13.671320.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T19_44_13.671320", "path": ["**/details_harness|winogrande|5_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T19-44-13.671320.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T19_44_13.671320", "path": ["results_2024-02-02T19-44-13.671320.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T19-44-13.671320.parquet"]}]}]} | 2024-02-02T19:46:22+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of BFauber/opt125m_10e5_40ep
Dataset automatically created during the evaluation run of model BFauber/opt125m_10e5_40ep on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
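```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BFauber__opt125m_10e5_40ep",
	"harness_winogrande_5",
	split="train")
```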
## Latest results
These are the latest results from run 2024-02-02T19:44:13.671320 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of BFauber/opt125m_10e5_40ep\n\n\n\nDataset automatically created during the evaluation run of model BFauber/opt125m_10e5_40ep on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T19:44:13.671320(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of BFauber/opt125m_10e5_40ep\n\n\n\nDataset automatically created during the evaluation run of model BFauber/opt125m_10e5_40ep on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T19:44:13.671320(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
3903c18e21421e640557fc47c65bf6a596ce8cb0 |
# Dataset Card for Evaluation run of BFauber/opt125m_10e5_50ep
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BFauber/opt125m_10e5_50ep](https://huggingface.co/BFauber/opt125m_10e5_50ep) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BFauber__opt125m_10e5_50ep",
"harness_winogrande_5",
split="train")
```
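The aggregated metrics live in the `results` configuration. A minimal sketch for pulling the most recent aggregate snapshot, assuming the standard leaderboard layout in which the `latest` split always tracks the newest run:

```python
from datasets import load_dataset

# "results" holds the aggregated metrics of each run; the "latest" split
# points to the most recent evaluation (here 2024-02-02T19:50:08.433413).
results = load_dataset("open-llm-leaderboard/details_BFauber__opt125m_10e5_50ep",
	"results",
	split="latest")
```

The numbers in this split should match the JSON summary reproduced below.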
## Latest results
These are the [latest results from run 2024-02-02T19:50:08.433413](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt125m_10e5_50ep/blob/main/results_2024-02-02T19-50-08.433413.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23829067580904603,
"acc_stderr": 0.030168081539566568,
"acc_norm": 0.2383084921515475,
"acc_norm_stderr": 0.030960161901202803,
"mc1": 0.24724602203182375,
"mc1_stderr": 0.015102404797359652,
"mc2": 0.4829960000266921,
"mc2_stderr": 0.01598626138943452
},
"harness|arc:challenge|25": {
"acc": 0.22013651877133106,
"acc_stderr": 0.012108124883460983,
"acc_norm": 0.23890784982935154,
"acc_norm_stderr": 0.012461071376316614
},
"harness|hellaswag|10": {
"acc": 0.27106154152559253,
"acc_stderr": 0.004435993492583855,
"acc_norm": 0.28978291177056364,
"acc_norm_stderr": 0.0045273436511308095
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.040247784019771096,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.040247784019771096
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.20754716981132076,
"acc_stderr": 0.02495991802891127,
"acc_norm": 0.20754716981132076,
"acc_norm_stderr": 0.02495991802891127
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.20833333333333334,
"acc_stderr": 0.033961162058453336,
"acc_norm": 0.20833333333333334,
"acc_norm_stderr": 0.033961162058453336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.03186209851641144,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.03186209851641144
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.25957446808510637,
"acc_stderr": 0.02865917937429232,
"acc_norm": 0.25957446808510637,
"acc_norm_stderr": 0.02865917937429232
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.0404933929774814,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.0404933929774814
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.31724137931034485,
"acc_stderr": 0.03878352372138623,
"acc_norm": 0.31724137931034485,
"acc_norm_stderr": 0.03878352372138623
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2275132275132275,
"acc_stderr": 0.021591269407823792,
"acc_norm": 0.2275132275132275,
"acc_norm_stderr": 0.021591269407823792
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1746031746031746,
"acc_stderr": 0.03395490020856112,
"acc_norm": 0.1746031746031746,
"acc_norm_stderr": 0.03395490020856112
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1870967741935484,
"acc_stderr": 0.022185710092252252,
"acc_norm": 0.1870967741935484,
"acc_norm_stderr": 0.022185710092252252
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.21674876847290642,
"acc_stderr": 0.02899033125251624,
"acc_norm": 0.21674876847290642,
"acc_norm_stderr": 0.02899033125251624
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.22424242424242424,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.22424242424242424,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.1919191919191919,
"acc_stderr": 0.02805779167298902,
"acc_norm": 0.1919191919191919,
"acc_norm_stderr": 0.02805779167298902
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2153846153846154,
"acc_stderr": 0.020843034557462878,
"acc_norm": 0.2153846153846154,
"acc_norm_stderr": 0.020843034557462878
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.026202766534652148,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.026202766534652148
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.19747899159663865,
"acc_stderr": 0.025859164122051467,
"acc_norm": 0.19747899159663865,
"acc_norm_stderr": 0.025859164122051467
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436777,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436777
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.21651376146788992,
"acc_stderr": 0.017658710594443135,
"acc_norm": 0.21651376146788992,
"acc_norm_stderr": 0.017658710594443135
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.28921568627450983,
"acc_stderr": 0.03182231867647554,
"acc_norm": 0.28921568627450983,
"acc_norm_stderr": 0.03182231867647554
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.28699551569506726,
"acc_stderr": 0.030360379710291947,
"acc_norm": 0.28699551569506726,
"acc_norm_stderr": 0.030360379710291947
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2147239263803681,
"acc_stderr": 0.03226219377286774,
"acc_norm": 0.2147239263803681,
"acc_norm_stderr": 0.03226219377286774
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285713,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285713
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2863247863247863,
"acc_stderr": 0.029614323690456648,
"acc_norm": 0.2863247863247863,
"acc_norm_stderr": 0.029614323690456648
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26181353767560667,
"acc_stderr": 0.01572083867844526,
"acc_norm": 0.26181353767560667,
"acc_norm_stderr": 0.01572083867844526
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.22793296089385476,
"acc_stderr": 0.014030149950805097,
"acc_norm": 0.22793296089385476,
"acc_norm_stderr": 0.014030149950805097
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.02428861946604611,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.02428861946604611
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.18971061093247588,
"acc_stderr": 0.022268196258783228,
"acc_norm": 0.18971061093247588,
"acc_norm_stderr": 0.022268196258783228
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23049645390070922,
"acc_stderr": 0.025123739226872405,
"acc_norm": 0.23049645390070922,
"acc_norm_stderr": 0.025123739226872405
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.1801470588235294,
"acc_stderr": 0.02334516361654485,
"acc_norm": 0.1801470588235294,
"acc_norm_stderr": 0.02334516361654485
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.04069306319721378,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.04069306319721378
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.29239766081871343,
"acc_stderr": 0.03488647713457921,
"acc_norm": 0.29239766081871343,
"acc_norm_stderr": 0.03488647713457921
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24724602203182375,
"mc1_stderr": 0.015102404797359652,
"mc2": 0.4829960000266921,
"mc2_stderr": 0.01598626138943452
},
"harness|winogrande|5": {
"acc": 0.5130228887134964,
"acc_stderr": 0.014047718393997667
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
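As a quick sanity check, the per-task numbers above can be re-aggregated by hand. The sketch below assumes a local copy of the results JSON with the flat layout shown above, and that the top-level figure is a plain unweighted mean over tasks; neither the file path nor the aggregation rule is confirmed by this card.

```python
import json

# Hypothetical local copy of the results file linked above; adjust the path.
with open("results_2024-02-02T19-50-08.433413.json") as f:
    results = json.load(f)

# Unweighted mean of the per-task accuracies, skipping the precomputed "all"
# summary; entries without an "acc" field (e.g. truthfulqa's mc1/mc2) are skipped.
accs = [v["acc"] for k, v in results.items() if k != "all" and "acc" in v]
print(f"mean acc over {len(accs)} tasks: {sum(accs) / len(accs):.4f}")
```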
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_BFauber__opt125m_10e5_50ep | [
"region:us"
] | 2024-02-02T19:51:51+00:00 | {"pretty_name": "Evaluation run of BFauber/opt125m_10e5_50ep", "dataset_summary": "Dataset automatically created during the evaluation run of model [BFauber/opt125m_10e5_50ep](https://huggingface.co/BFauber/opt125m_10e5_50ep) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BFauber__opt125m_10e5_50ep\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T19:50:08.433413](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__opt125m_10e5_50ep/blob/main/results_2024-02-02T19-50-08.433413.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23829067580904603,\n \"acc_stderr\": 0.030168081539566568,\n \"acc_norm\": 0.2383084921515475,\n \"acc_norm_stderr\": 0.030960161901202803,\n \"mc1\": 0.24724602203182375,\n \"mc1_stderr\": 0.015102404797359652,\n \"mc2\": 0.4829960000266921,\n \"mc2_stderr\": 0.01598626138943452\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.22013651877133106,\n \"acc_stderr\": 0.012108124883460983,\n \"acc_norm\": 0.23890784982935154,\n \"acc_norm_stderr\": 0.012461071376316614\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.27106154152559253,\n \"acc_stderr\": 0.004435993492583855,\n \"acc_norm\": 0.28978291177056364,\n \"acc_norm_stderr\": 0.0045273436511308095\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.040247784019771096,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.040247784019771096\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.20754716981132076,\n \"acc_stderr\": 0.02495991802891127,\n \"acc_norm\": 0.20754716981132076,\n \"acc_norm_stderr\": 0.02495991802891127\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.20833333333333334,\n \"acc_stderr\": 0.033961162058453336,\n \"acc_norm\": 0.20833333333333334,\n \"acc_norm_stderr\": 0.033961162058453336\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2254335260115607,\n \"acc_stderr\": 0.03186209851641144,\n \"acc_norm\": 0.2254335260115607,\n \"acc_norm_stderr\": 0.03186209851641144\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.25957446808510637,\n \"acc_stderr\": 0.02865917937429232,\n \"acc_norm\": 0.25957446808510637,\n \"acc_norm_stderr\": 0.02865917937429232\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.0404933929774814,\n \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.0404933929774814\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.31724137931034485,\n \"acc_stderr\": 0.03878352372138623,\n \"acc_norm\": 0.31724137931034485,\n \"acc_norm_stderr\": 0.03878352372138623\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2275132275132275,\n \"acc_stderr\": 0.021591269407823792,\n \"acc_norm\": 0.2275132275132275,\n \"acc_norm_stderr\": 0.021591269407823792\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1746031746031746,\n \"acc_stderr\": 0.03395490020856112,\n \"acc_norm\": 0.1746031746031746,\n \"acc_norm_stderr\": 0.03395490020856112\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1870967741935484,\n \"acc_stderr\": 0.022185710092252252,\n \"acc_norm\": 0.1870967741935484,\n \"acc_norm_stderr\": 0.022185710092252252\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.21674876847290642,\n \"acc_stderr\": 0.02899033125251624,\n \"acc_norm\": 0.21674876847290642,\n \"acc_norm_stderr\": 0.02899033125251624\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.22424242424242424,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.22424242424242424,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.1919191919191919,\n \"acc_stderr\": 0.02805779167298902,\n \"acc_norm\": 0.1919191919191919,\n \"acc_norm_stderr\": 0.02805779167298902\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n \"acc_norm\": 0.19689119170984457,\n 
\"acc_norm_stderr\": 0.028697873971860664\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2153846153846154,\n \"acc_stderr\": 0.020843034557462878,\n \"acc_norm\": 0.2153846153846154,\n \"acc_norm_stderr\": 0.020843034557462878\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24444444444444444,\n \"acc_stderr\": 0.026202766534652148,\n \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.026202766534652148\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.19747899159663865,\n \"acc_stderr\": 0.025859164122051467,\n \"acc_norm\": 0.19747899159663865,\n \"acc_norm_stderr\": 0.025859164122051467\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436777,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436777\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.21651376146788992,\n \"acc_stderr\": 0.017658710594443135,\n \"acc_norm\": 0.21651376146788992,\n \"acc_norm_stderr\": 0.017658710594443135\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.28921568627450983,\n \"acc_stderr\": 0.03182231867647554,\n \"acc_norm\": 0.28921568627450983,\n \"acc_norm_stderr\": 0.03182231867647554\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.28699551569506726,\n \"acc_stderr\": 0.030360379710291947,\n \"acc_norm\": 0.28699551569506726,\n \"acc_norm_stderr\": 0.030360379710291947\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2147239263803681,\n \"acc_stderr\": 0.03226219377286774,\n \"acc_norm\": 0.2147239263803681,\n \"acc_norm_stderr\": 0.03226219377286774\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n \"acc_stderr\": 0.04464285714285713,\n \"acc_norm\": 0.33035714285714285,\n \"acc_norm_stderr\": 0.04464285714285713\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2863247863247863,\n \"acc_stderr\": 0.029614323690456648,\n \"acc_norm\": 0.2863247863247863,\n \"acc_norm_stderr\": 0.029614323690456648\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 
0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26181353767560667,\n \"acc_stderr\": 0.01572083867844526,\n \"acc_norm\": 0.26181353767560667,\n \"acc_norm_stderr\": 0.01572083867844526\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.22793296089385476,\n \"acc_stderr\": 0.014030149950805097,\n \"acc_norm\": 0.22793296089385476,\n \"acc_norm_stderr\": 0.014030149950805097\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.02428861946604611,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.02428861946604611\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.18971061093247588,\n \"acc_stderr\": 0.022268196258783228,\n \"acc_norm\": 0.18971061093247588,\n \"acc_norm_stderr\": 0.022268196258783228\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23049645390070922,\n \"acc_stderr\": 0.025123739226872405,\n \"acc_norm\": 0.23049645390070922,\n \"acc_norm_stderr\": 0.025123739226872405\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.1801470588235294,\n \"acc_stderr\": 0.02334516361654485,\n \"acc_norm\": 0.1801470588235294,\n \"acc_norm_stderr\": 0.02334516361654485\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.23636363636363636,\n \"acc_stderr\": 0.04069306319721378,\n \"acc_norm\": 0.23636363636363636,\n \"acc_norm_stderr\": 0.04069306319721378\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.29239766081871343,\n \"acc_stderr\": 0.03488647713457921,\n \"acc_norm\": 0.29239766081871343,\n \"acc_norm_stderr\": 0.03488647713457921\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24724602203182375,\n \"mc1_stderr\": 0.015102404797359652,\n \"mc2\": 0.4829960000266921,\n \"mc2_stderr\": 0.01598626138943452\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5130228887134964,\n \"acc_stderr\": 0.014047718393997667\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/BFauber/opt125m_10e5_50ep", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|arc:challenge|25_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|gsm8k|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hellaswag|10_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-50-08.433413.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-50-08.433413.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-50-08.433413.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T19-50-08.433413.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-50-08.433413.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T19_50_08.433413", "path": ["**/details_harness|winogrande|5_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T19-50-08.433413.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T19_50_08.433413", "path": ["results_2024-02-02T19-50-08.433413.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T19-50-08.433413.parquet"]}]}]} | 2024-02-02T19:52:18+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of BFauber/opt125m_10e5_50ep
Dataset automatically created during the evaluation run of model BFauber/opt125m_10e5_50ep on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
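To see the full list of configurations without downloading any data, a minimal sketch (the helper below is part of the `datasets` library):

```python
from datasets import get_dataset_config_names

# One configuration per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names("open-llm-leaderboard/details_BFauber__opt125m_10e5_50ep")
print(len(configs))
print(configs[:5])
```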
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
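For example, a minimal sketch of pulling just the aggregated metrics (the config name "results" and the split name "latest" both appear in this repo's file listing):

```python
from datasets import load_dataset

# "results" holds the aggregated metrics; the "latest" split always
# mirrors the most recent evaluation run of this model.
results = load_dataset(
    "open-llm-leaderboard/details_BFauber__opt125m_10e5_50ep",
    "results",
    split="latest",
)
print(results[0])
```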
To load the details from a run, you can for instance do the following:
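A minimal sketch, using one of the per-task configurations listed in this repo:

```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_BFauber__opt125m_10e5_50ep",
    "harness_winogrande_5",  # any configuration name from this repo works here
    split="train",
)
```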
## Latest results
These are the latest results from run 2024-02-02T19:50:08.433413 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
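Each run also keeps its own timestamped split, so a specific evaluation can be pinned instead of "latest". A sketch using a config and split name that appear in this repo's file listing:

```python
from datasets import load_dataset

# Pin the run by its timestamped split; with a single run this is the
# same data that the "latest" split points to.
details = load_dataset(
    "open-llm-leaderboard/details_BFauber__opt125m_10e5_50ep",
    "harness_gsm8k_5",
    split="2024_02_02T19_50_08.433413",
)
print(details)
```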
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of BFauber/opt125m_10e5_50ep\n\n\n\nDataset automatically created during the evaluation run of model BFauber/opt125m_10e5_50ep on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T19:50:08.433413(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of BFauber/opt125m_10e5_50ep\n\n\n\nDataset automatically created during the evaluation run of model BFauber/opt125m_10e5_50ep on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T19:50:08.433413(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
efb079bea02bb5a567ac45b4c42b5ad9ab80a8ff |
# CrawlPT
CrawlPT is a generic Portuguese corpus extracted from various web pages.
## Dataset Details
The dataset is composed of three corpora:
[brWaC](https://aclanthology.org/L18-1686/), [C100-PT](https://arxiv.org/abs/1911.02116), [OSCAR-2301](http://arxiv.org/abs/2201.06642).
- **brWaC**: a web corpus for Brazilian Portuguese from 120,000 different websites.
- **C100-PT**: the Portuguese subset of CC-100. C100 was created for training the multilingual Transformer XLM-R, containing two terabytes of cleaned data from 2018 snapshots of the [Common Crawl project](https://commoncrawl.org/about/) in 100 languages. We use the Portuguese subset, which contains 49.1 GiB of text.
- **OSCAR-2301-PT**: a curated Portuguese-language subset of OSCAR-2301.
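Each corpus is exposed as a separate configuration. A minimal loading sketch (the configuration names `brwac`, `cc100`, and `OSCAR-2301` and the single `train` split come from this card's metadata; `streaming=True` is optional but avoids downloading tens of GiB up front):
```python
from datasets import load_dataset

# Stream one of the three CrawlPT corpora by configuration name
# ("brwac", "cc100", or "OSCAR-2301"); each exposes a single "train" split.
brwac = load_dataset("eduagarcia/CrawlPT", "brwac", split="train", streaming=True)

for example in brwac.take(1):
    print(example["text"][:200])
```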
### Dataset Description
- **Curated by:** [More Information Needed]
- **Funded by:** [More Information Needed]
- **Language(s) (NLP):** Brazilian Portuguese (pt-BR)
- **License:** [Creative Commons Attribution 4.0 International Public License](https://creativecommons.org/licenses/by/4.0/deed.en)
### Dataset Sources
- **Repository:** https://github.com/eduagarcia/roberta-legal-portuguese
- **Paper:** [More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Data Collection and Processing
Raw corpora sizes in terms of billions of tokens and file size in GiB:
| Corpus | Domain | Tokens (B) | Size (GiB) |
|-----------------|:-------:|:----------:|:----------:|
| brWaC | General | 2.7 | 16.3 |
| CC100 (PT) | General | 8.4 | 49.1 |
| OSCAR-2301 (PT) | General | 18.1 | 97.8 |
CrawlPT is deduplicated using the [MinHash algorithm](https://dl.acm.org/doi/abs/10.5555/647819.736184) and [Locality Sensitive Hashing](https://dspace.mit.edu/bitstream/handle/1721.1/134231/v008a014.pdf?sequence=2&isAllowed=y), following the approach of [Lee et al. (2022)](http://arxiv.org/abs/2107.06499).
We used 5-grams and a signature of size 256, considering two documents to be duplicates if their Jaccard similarity exceeded 0.7.
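As a rough, self-contained sketch of this scheme (not the authors' exact pipeline; the `datasketch` library and the toy documents below are illustrative assumptions):
```python
# Sketch of MinHash-LSH deduplication as described above: 5-grams,
# signatures of 256 permutations, Jaccard threshold 0.7.
from datasketch import MinHash, MinHashLSH

def ngrams(text, n=5):
    # Word-level 5-grams; short documents fall back to a single gram.
    tokens = text.split()
    return {" ".join(tokens[i:i + n]) for i in range(max(len(tokens) - n + 1, 1))}

docs = {
    "doc-0": "o corpus foi coletado de diversas paginas da web em portugues brasileiro",
    "doc-1": "o corpus foi coletado de diversas paginas da web em portugues brasileiro",
    "doc-2": "um texto completamente diferente sobre outro assunto qualquer",
}

lsh = MinHashLSH(threshold=0.7, num_perm=256)
minhashes = {}
for doc_id, text in docs.items():
    m = MinHash(num_perm=256)
    for gram in ngrams(text):
        m.update(gram.encode("utf-8"))
    minhashes[doc_id] = m
    lsh.insert(doc_id, m)

# Query near-duplicate candidates for each document; one document per
# duplicate cluster would be kept in the final corpus.
for doc_id, m in minhashes.items():
    candidates = [c for c in lsh.query(m) if c != doc_id]
    print(doc_id, "->", candidates)
```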
Duplicate rates found by the MinHash-LSH algorithm for the CrawlPT corpus:
| Corpus | Documents | Docs. after deduplication | Duplicates (%) |
|------------------------|:----------:|:-------------------------:|:--------------:|
| brWaC | 3,530,796 | 3,513,588 | 0.49 |
| OSCAR-2301 (PT Subset) | 18,031,400 | 10,888,966 | 39.61 |
| CC100 (PT Subset) | 38,999,388 | 38,059,979 | 2.41 |
| Total (CrawlPT) | 60,561,584 | 52,462,533 | 13.37 |
## Citation
```bibtex
@InProceedings{garcia2024_roberlexpt,
author="Garcia, Eduardo A. S.
and Silva, N{\'a}dia F. F.
and Siqueira, Felipe
and Gomes, Juliana R. S.
and Albuquerque, Hidelberg O.
and Souza, Ellen
and Lima, Eliomar
and De Carvalho, André",
title="RoBERTaLexPT: A Legal RoBERTa Model pretrained with deduplication for Portuguese",
booktitle="Computational Processing of the Portuguese Language",
year="2024",
publisher="Association for Computational Linguistics"
}
```
## Acknowledgment
This work has been supported by the AI Center of Excellence (Centro de Excelência em Inteligência Artificial – CEIA) of the Institute of Informatics at the Federal University of Goiás (INF-UFG). | eduagarcia/CrawlPT | [
"language:pt",
"license:cc-by-4.0",
"arxiv:1911.02116",
"arxiv:2201.06642",
"arxiv:2107.06499",
"region:us"
] | 2024-02-02T20:13:07+00:00 | {"language": ["pt"], "license": "cc-by-4.0", "pretty_name": "C", "dataset_info": [{"config_name": "OSCAR-2301", "features": [{"name": "id", "dtype": "int64"}, {"name": "text", "dtype": "string"}, {"name": "meta", "struct": [{"name": "categories", "sequence": "string"}, {"name": "dedup", "struct": [{"name": "exact_norm", "struct": [{"name": "cluster_main_idx", "dtype": "int64"}, {"name": "cluster_size", "dtype": "int64"}, {"name": "exact_hash_idx", "dtype": "int64"}, {"name": "is_duplicate", "dtype": "bool"}]}, {"name": "minhash", "struct": [{"name": "cluster_main_idx", "dtype": "int64"}, {"name": "cluster_size", "dtype": "int64"}, {"name": "is_duplicate", "dtype": "bool"}, {"name": "minhash_idx", "dtype": "int64"}]}]}, {"name": "harmful_pp", "dtype": "float64"}, {"name": "identification", "struct": [{"name": "label", "dtype": "string"}, {"name": "prob", "dtype": "float64"}]}, {"name": "quality_warnings", "sequence": "string"}, {"name": "sentence_identifications", "list": [{"name": "label", "dtype": "string"}, {"name": "prob", "dtype": "float64"}]}, {"name": "tlsh", "dtype": "string"}, {"name": "warc_headers", "struct": [{"name": "content-length", "dtype": "int64"}, {"name": "content-type", "dtype": "string"}, {"name": "warc-block-digest", "dtype": "string"}, {"name": "warc-date", "dtype": "string"}, {"name": "warc-identified-content-language", "dtype": "string"}, {"name": "warc-record-id", "dtype": "string"}, {"name": "warc-refers-to", "dtype": "string"}, {"name": "warc-target-uri", "dtype": "string"}, {"name": "warc-type", "dtype": "string"}]}]}], "splits": [{"name": "train", "num_bytes": 127937389641, "num_examples": 18031400}], "download_size": 68773837112, "dataset_size": 127937389641}, {"config_name": "brwac", "features": [{"name": "id", "dtype": "int64"}, {"name": "text", "dtype": "string"}, {"name": "meta", "struct": [{"name": "dedup", "struct": [{"name": "exact_norm", "struct": [{"name": "cluster_main_idx", "dtype": "int64"}, {"name": "cluster_size", "dtype": "int64"}, {"name": "exact_hash_idx", "dtype": "int64"}, {"name": "is_duplicate", "dtype": "bool"}]}, {"name": "minhash", "struct": [{"name": "cluster_main_idx", "dtype": "int64"}, {"name": "cluster_size", "dtype": "int64"}, {"name": "is_duplicate", "dtype": "bool"}, {"name": "minhash_idx", "dtype": "int64"}]}]}, {"name": "doc_id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "uri", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 18308163747, "num_examples": 3530796}], "download_size": 11184800378, "dataset_size": 18308163747}, {"config_name": "cc100", "features": [{"name": "id", "dtype": "int64"}, {"name": "text", "dtype": "string"}, {"name": "meta", "struct": [{"name": "dedup", "struct": [{"name": "exact_norm", "struct": [{"name": "cluster_main_idx", "dtype": "int64"}, {"name": "cluster_size", "dtype": "int64"}, {"name": "exact_hash_idx", "dtype": "int64"}, {"name": "is_duplicate", "dtype": "bool"}]}, {"name": "minhash", "struct": [{"name": "cluster_main_idx", "dtype": "int64"}, {"name": "cluster_size", "dtype": "int64"}, {"name": "is_duplicate", "dtype": "bool"}, {"name": "minhash_idx", "dtype": "int64"}]}]}]}], "splits": [{"name": "train", "num_bytes": 55033381569, "num_examples": 38999388}], "download_size": 35074345417, "dataset_size": 55033381569}], "configs": [{"config_name": "OSCAR-2301", "data_files": [{"split": "train", "path": "OSCAR-2301/train-*"}]}, {"config_name": "brwac", "data_files": [{"split": "train", "path": "brwac/train-*"}]}, 
{"config_name": "cc100", "data_files": [{"split": "train", "path": "cc100/train-*"}]}]} | 2024-02-09T18:46:14+00:00 | [
"1911.02116",
"2201.06642",
"2107.06499"
] | [
"pt"
] | TAGS
#language-Portuguese #license-cc-by-4.0 #arxiv-1911.02116 #arxiv-2201.06642 #arxiv-2107.06499 #region-us
| CrawlPT
=======
CrawlPT is a generic Portuguese corpus extracted from various web pages.
Dataset Details
---------------
The dataset is composed of three corpora:
brWaC, C100-PT, OSCAR-2301.
* brWaC: a web corpus for Brazilian Portuguese from 120,000 different websites.
* C100-PT: the Portuguese subset of CC-100. C100 was created for training the multilingual Transformer XLM-R, containing two terabytes of cleaned data from 2018 snapshots of the Common Crawl project in 100 languages. We use the Portuguese subset, which contains 49.1 GiB of text.
* OSCAR-2301-PT: a curated Portuguese-language subset of OSCAR-2301.
### Dataset Description
* Curated by:
* Funded by:
* Language(s) (NLP): Brazilian Portuguese (pt-BR)
* License: Creative Commons Attribution 4.0 International Public License
### Dataset Sources
* Repository: URL
* Paper:
Dataset Structure
-----------------
Data Collection and Processing
------------------------------
Raw corpora sizes in terms of billions of tokens and file size in GiB:
CrawlPT is deduplicated using the MinHash algorithm and Locality Sensitive Hashing, following the approach of Lee et al. (2022).
We used 5-grams and a signature of size 256, considering two documents to be duplicates if their Jaccard similarity exceeded 0.7.
Duplicate rates found by the MinHash-LSH algorithm for the CrawlPT corpus:
Acknowledgment
--------------
This work has been supported by the AI Center of Excellence (Centro de Excelência em Inteligência Artificial – CEIA) of the Institute of Informatics at the Federal University of Goiás (INF-UFG).
| [
"### Dataset Description\n* Curated by:\n* Funded by:\n* Language(s) (NLP): Brazilian Portuguese (pt-BR)\n* License: Creative Commons Attribution 4.0 International Public License",
"### Dataset Sources\n\n\n* Repository: URL\n* Paper:\n\n\nDataset Structure\n-----------------\n\n\nData Collection and Processing\n------------------------------\n\n\nRaw corpora sizes in terms of billions of tokens and file size in GiB:\n\n\n\nCrawlPT is deduplicated using MinHash algorithm and Locality Sensitive Hashing, following the approach of Lee et al. (2022).\n\n\nWe used 5-grams and a signature of size 256, considering two documents to be identical if their Jaccard Similarity exceeded 0.7.\nDeduplicate rate found by the Minhash-LSH algorithm for the CrawlPT corpus:\n\n\n\nAcknowledgment\n--------------\n\n\nThis work has been supported by the AI Center of Excellence (Centro de Excelência em Inteligência Artificial – CEIA) of the Institute of Informatics at the Federal University of Goiás (INF-UFG)."
] | [
"TAGS\n#language-Portuguese #license-cc-by-4.0 #arxiv-1911.02116 #arxiv-2201.06642 #arxiv-2107.06499 #region-us \n",
"### Dataset Description\n* Curated by:\n* Funded by:\n* Language(s) (NLP): Brazilian Portuguese (pt-BR)\n* License: Creative Commons Attribution 4.0 International Public License",
"### Dataset Sources\n\n\n* Repository: URL\n* Paper:\n\n\nDataset Structure\n-----------------\n\n\nData Collection and Processing\n------------------------------\n\n\nRaw corpora sizes in terms of billions of tokens and file size in GiB:\n\n\n\nCrawlPT is deduplicated using MinHash algorithm and Locality Sensitive Hashing, following the approach of Lee et al. (2022).\n\n\nWe used 5-grams and a signature of size 256, considering two documents to be identical if their Jaccard Similarity exceeded 0.7.\nDeduplicate rate found by the Minhash-LSH algorithm for the CrawlPT corpus:\n\n\n\nAcknowledgment\n--------------\n\n\nThis work has been supported by the AI Center of Excellence (Centro de Excelência em Inteligência Artificial – CEIA) of the Institute of Informatics at the Federal University of Goiás (INF-UFG)."
] |
8b1f2cf243f38487132f3817d3b4498168ba96e0 |
# Dataset Card for Evaluation run of ibivibiv/multimaster-7b-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ibivibiv/multimaster-7b-v2](https://huggingface.co/ibivibiv/multimaster-7b-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ibivibiv__multimaster-7b-v2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-02T20:13:08.310124](https://huggingface.co/datasets/open-llm-leaderboard/details_ibivibiv__multimaster-7b-v2/blob/main/results_2024-02-02T20-13-08.310124.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6563425377048554,
"acc_stderr": 0.0318718255323794,
"acc_norm": 0.6556612698627262,
"acc_norm_stderr": 0.03254066149146999,
"mc1": 0.4394124847001224,
"mc1_stderr": 0.017374520482513707,
"mc2": 0.606301955480099,
"mc2_stderr": 0.01544904749324005
},
"harness|arc:challenge|25": {
"acc": 0.6723549488054608,
"acc_stderr": 0.013715847940719339,
"acc_norm": 0.7047781569965871,
"acc_norm_stderr": 0.013329750293382318
},
"harness|hellaswag|10": {
"acc": 0.6995618402708623,
"acc_stderr": 0.004575116093931906,
"acc_norm": 0.8759211312487553,
"acc_norm_stderr": 0.0032899775233939097
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6,
"acc_stderr": 0.040824829046386284,
"acc_norm": 0.6,
"acc_norm_stderr": 0.040824829046386284
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055266,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055266
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.02931820364520686,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.02931820364520686
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977934,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977934
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250447,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250447
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258176,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.44581005586592176,
"acc_stderr": 0.016623998513333106,
"acc_norm": 0.44581005586592176,
"acc_norm_stderr": 0.016623998513333106
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460842,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460842
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4661016949152542,
"acc_stderr": 0.012740853872949834,
"acc_norm": 0.4661016949152542,
"acc_norm_stderr": 0.012740853872949834
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.01866335967146367,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.01866335967146367
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578334,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578334
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4394124847001224,
"mc1_stderr": 0.017374520482513707,
"mc2": 0.606301955480099,
"mc2_stderr": 0.01544904749324005
},
"harness|winogrande|5": {
"acc": 0.8429360694554064,
"acc_stderr": 0.010226303949598482
},
"harness|gsm8k|5": {
"acc": 0.7187263078089462,
"acc_stderr": 0.012384789310940244
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_ibivibiv__multimaster-7b-v2 | [
"region:us"
] | 2024-02-02T20:15:27+00:00 | {"pretty_name": "Evaluation run of ibivibiv/multimaster-7b-v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [ibivibiv/multimaster-7b-v2](https://huggingface.co/ibivibiv/multimaster-7b-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ibivibiv__multimaster-7b-v2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T20:13:08.310124](https://huggingface.co/datasets/open-llm-leaderboard/details_ibivibiv__multimaster-7b-v2/blob/main/results_2024-02-02T20-13-08.310124.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6563425377048554,\n \"acc_stderr\": 0.0318718255323794,\n \"acc_norm\": 0.6556612698627262,\n \"acc_norm_stderr\": 0.03254066149146999,\n \"mc1\": 0.4394124847001224,\n \"mc1_stderr\": 0.017374520482513707,\n \"mc2\": 0.606301955480099,\n \"mc2_stderr\": 0.01544904749324005\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6723549488054608,\n \"acc_stderr\": 0.013715847940719339,\n \"acc_norm\": 0.7047781569965871,\n \"acc_norm_stderr\": 0.013329750293382318\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6995618402708623,\n \"acc_stderr\": 0.004575116093931906,\n \"acc_norm\": 0.8759211312487553,\n \"acc_norm_stderr\": 0.0032899775233939097\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.040824829046386284,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.040824829046386284\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055266,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055266\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.362962962962963,\n \"acc_stderr\": 0.02931820364520686,\n \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.02931820364520686\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977934,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977934\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250447,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250447\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 
0.8263090676883781,\n \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258176,\n \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258176\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.44581005586592176,\n \"acc_stderr\": 0.016623998513333106,\n \"acc_norm\": 0.44581005586592176,\n \"acc_norm_stderr\": 0.016623998513333106\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460842,\n \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460842\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4661016949152542,\n \"acc_stderr\": 0.012740853872949834,\n \"acc_norm\": 0.4661016949152542,\n \"acc_norm_stderr\": 0.012740853872949834\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.01866335967146367,\n \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.01866335967146367\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578334,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4394124847001224,\n \"mc1_stderr\": 0.017374520482513707,\n \"mc2\": 0.606301955480099,\n \"mc2_stderr\": 0.01544904749324005\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8429360694554064,\n \"acc_stderr\": 0.010226303949598482\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7187263078089462,\n \"acc_stderr\": 0.012384789310940244\n }\n}\n```", "repo_url": 
"https://huggingface.co/ibivibiv/multimaster-7b-v2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|arc:challenge|25_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|gsm8k|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hellaswag|10_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T20-13-08.310124.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T20-13-08.310124.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T20-13-08.310124.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T20-13-08.310124.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T20-13-08.310124.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T20_13_08.310124", "path": ["**/details_harness|winogrande|5_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T20-13-08.310124.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T20_13_08.310124", "path": ["results_2024-02-02T20-13-08.310124.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T20-13-08.310124.parquet"]}]}]} | 2024-02-02T20:15:51+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of ibivibiv/multimaster-7b-v2
Dataset automatically created during the evaluation run of model ibivibiv/multimaster-7b-v2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
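Following the pattern used by the other evaluation-run cards in this collection, the snippet below assumes the repository id `open-llm-leaderboard/details_ibivibiv__multimaster-7b-v2` (derived from the usual `details_<org>__<model>` naming convention):

```python
from datasets import load_dataset

# "harness_winogrande_5" is one of the per-task configurations listed in this
# repository's metadata; the "train" split points at the latest results.
data = load_dataset("open-llm-leaderboard/details_ibivibiv__multimaster-7b-v2",
                    "harness_winogrande_5",
                    split="train")
```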
## Latest results
These are the latest results from run 2024-02-02T20:13:08.310124 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
9f1e2799a5e6ab466d317c4e5abdbd3573466eef |

# Participate in the shared task!
We introduce the task of visual figurative language understanding. Participate [here!](https://www.codabench.org/competitions/1970/?secret_key=8997458f-b297-4c0e-b17b-452cb2924ba7)
# Description
Figurative language such as metaphors, similes, sarcasm, or humor is often conveyed visually, and frequently appears in advertising, news, and social media. In the previous iteration of the workshop, we introduced a shared task for figurative language understanding around the textual entailment paradigm, where the hypothesis is a sentence containing the figurative language expression (e.g., metaphor, sarcasm, idiom, simile) and the premise is a literal sentence containing the literal meaning. In this shared task, we aim at Visual Understanding of Figurative Language, framed as a visual entailment task: given an <image, text> pair, a model needs to predict Entails or Contradicts. This task contains a compilation of datasets covering visual metaphors, idioms, similes, sarcasm, and humor. There are two important aspects of this task and the associated dataset: 1) the task requires generating not only the label (entail/contradict) but also a plausible explanation for the prediction; 2) the entail/contradict label and the explanation are related to the meaning of the figurative language expression.
The training data for this task is compiled from an array of prior work on visual metaphors and multimodal understanding, augmented with annotated explanations detailing the entailment relationship. Specifically, the data consists of:
- A subset of 731 instances from the Visual Metaphors dataset released in the paper [I Spy a Metaphor: Large Language Models and Diffusion Models Co-Create Visual Metaphors](https://aclanthology.org/2023.findings-acl.465/)
- A subset of 1,323 textual metaphors accompanied by images illustrating their meaning from the paper [IRFL: Image Recognition of Figurative Language](https://arxiv.org/abs/2303.15445)
- A subset of 853 memes accompanied by annotated claims and explanations from the paper [MemeCap: A Dataset for Captioning and Interpreting Memes](https://aclanthology.org/2023.emnlp-main.89/)
- A subset of 1,000 sarcastic captions accompanied by images from the paper [Nice Perfume. How Long Did You Marinate in It? Multimodal Sarcasm Explanation](https://ojs.aaai.org/index.php/AAAI/article/view/21300)
- A subset of ~~2,470~~ 520 *unique* images with captions from the New Yorker Caption Contest, accompanied by textual explanations for why the captions entail the cartoons, from the paper [Do Androids Laugh at Electric Sheep? Humor “Understanding” Benchmarks from The New Yorker Caption Contest](https://aclanthology.org/2023.acl-long.41/). UPDATE: Due to a misunderstanding of the format of this dataset, many duplicate instances were uploaded. In fact, there are only 390 unique instances in the training set and 130 unique instances in the validation set. We recommend de-duplicating the data prior to proceeding with experiments (a loading-and-deduplication sketch is shown after this list).
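A minimal loading-and-deduplication sketch, assuming the `datasets` library and the feature names (`claim`, `label`, `explanation`, plus `image` and `source_dataset` columns) listed in this repository's metadata; the deduplication key is an illustrative choice, not part of the official task:

```python
from datasets import load_dataset

# Load both splits of the shared-task data.
dataset = load_dataset("ColumbiaNLP/V-FLUTE")

# De-duplicate on the (claim, label, explanation) triple; this mainly affects
# the New Yorker Caption Contest subset mentioned above.
seen = set()
unique_rows = []
for row in dataset["train"]:
    key = (row["claim"], row["label"], row["explanation"])
    if key not in seen:
        seen.add(key)
        unique_rows.append(row)

print(f"{len(unique_rows)} unique training instances")
```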
# Citation
Our dataset is based on a significant amount of prior work. Please cite the following:
Please cite the IRFL and Visual Metaphor datasets, which provided the images and captions:
IRFL:
```
@misc{yosef2023irfl,
      title={IRFL: Image Recognition of Figurative Language},
author={Ron Yosef and Yonatan Bitton and Dafna Shahaf},
year={2023},
eprint={2303.15445},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
I Spy a Metaphor: Large Language Models and Diffusion Models Co-Create Visual Metaphors
```
@inproceedings{chakrabarty-etal-2023-spy,
title = "{I} Spy a Metaphor: Large Language Models and Diffusion Models Co-Create Visual Metaphors",
author = "Chakrabarty, Tuhin and
Saakyan, Arkadiy and
Winn, Olivia and
Panagopoulou, Artemis and
Yang, Yue and
Apidianaki, Marianna and
Muresan, Smaranda",
editor = "Rogers, Anna and
Boyd-Graber, Jordan and
Okazaki, Naoaki",
booktitle = "Findings of the Association for Computational Linguistics: ACL 2023",
month = jul,
year = "2023",
address = "Toronto, Canada",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.findings-acl.465",
doi = "10.18653/v1/2023.findings-acl.465",
pages = "7370--7388",
abstract = "Visual metaphors are powerful rhetorical devices used to persuade or communicate creative ideas through images. Similar to linguistic metaphors, they convey meaning implicitly through symbolism and juxtaposition of the symbols. We propose a new task of generating visual metaphors from linguistic metaphors. This is a challenging task for diffusion-based text-to-image models, such as DALL$\cdot$E 2, since it requires the ability to model implicit meaning and compositionality. We propose to solve the task through the collaboration between Large Language Models (LLMs) and Diffusion Models: Instruct GPT-3 (davinci-002) with Chain-of-Thought prompting generates text that represents a visual elaboration of the linguistic metaphor containing the implicit meaning and relevant objects, which is then used as input to the diffusion-based text-to-image models. Using a human-AI collaboration framework, where humans interact both with the LLM and the top-performing diffusion model, we create a high-quality dataset containing 6,476 visual metaphors for 1,540 linguistic metaphors and their associated visual elaborations. Evaluation by professional illustrators shows the promise of LLM-Diffusion Model collaboration for this task.To evaluate the utility of our Human-AI collaboration framework and the quality of our dataset, we perform both an intrinsic human-based evaluation and an extrinsic evaluation using visual entailment as a downstream task.",
}
```
Please cite the following source that provides images and initial captions and explanations:
MemeCap: A Dataset for Captioning and Interpreting Memes
```
@inproceedings{hwang-shwartz-2023-memecap,
title = "{M}eme{C}ap: A Dataset for Captioning and Interpreting Memes",
author = "Hwang, EunJeong and
Shwartz, Vered",
editor = "Bouamor, Houda and
Pino, Juan and
Bali, Kalika",
booktitle = "Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing",
month = dec,
year = "2023",
address = "Singapore",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.emnlp-main.89",
doi = "10.18653/v1/2023.emnlp-main.89",
pages = "1433--1445",
abstract = "Memes are a widely popular tool for web users to express their thoughts using visual metaphors. Understanding memes requires recognizing and interpreting visual metaphors with respect to the text inside or around the meme, often while employing background knowledge and reasoning abilities. We present the task of meme captioning and release a new dataset, MemeCap. Our dataset contains 6.3K memes along with the title of the post containing the meme, the meme captions, the literal image caption, and the visual metaphors. Despite the recent success of vision and language (VL) models on tasks such as image captioning and visual question answering, our extensive experiments using state-of-the-art VL models show that they still struggle with visual metaphors, and perform substantially worse than humans.",
}
```
Please cite the following data sources that provide images, captions, and explanations:
[Do Androids Laugh at Electric Sheep? Humor "Understanding" Benchmarks from The New Yorker Caption Contest](https://arxiv.org/abs/2209.06293)
```
@inproceedings{hessel2023androids,
title={Do Androids Laugh at Electric Sheep? {Humor} ``Understanding''
Benchmarks from {The New Yorker Caption Contest}},
author={Hessel, Jack and Marasovi{\'c}, Ana and Hwang, Jena D. and Lee, Lillian
and Da, Jeff and Zellers, Rowan and Mankoff, Robert and Choi, Yejin},
booktitle={Proceedings of the ACL},
year={2023}
}
```
Please also cite the following, from which the cartoons and captions in the New Yorker Caption Contest dataset are derived:
```
@misc{newyorkernextmldataset,
author={Jain, Lalit and Jamieson, Kevin and Mankoff, Robert and Nowak, Robert and Sievert, Scott},
title={The {N}ew {Y}orker Cartoon Caption Contest Dataset},
year={2020},
url={https://nextml.github.io/caption-contest-data/}
}
@inproceedings{radev-etal-2016-humor,
title = "Humor in Collective Discourse: Unsupervised Funniness Detection in The {New Yorker} Cartoon Caption Contest",
author = "Radev, Dragomir and
Stent, Amanda and
Tetreault, Joel and
Pappu, Aasish and
Iliakopoulou, Aikaterini and
Chanfreau, Agustin and
de Juan, Paloma and
Vallmitjana, Jordi and
Jaimes, Alejandro and
Jha, Rahul and
Mankoff, Robert",
booktitle = "LREC",
year = "2016",
}
@inproceedings{shahaf2015inside,
title={Inside jokes: Identifying humorous cartoon captions},
author={Shahaf, Dafna and Horvitz, Eric and Mankoff, Robert},
booktitle={KDD},
year={2015},
}
``` | ColumbiaNLP/V-FLUTE | [
"task_categories:visual-question-answering",
"size_categories:1K<n<10K",
"language:en",
"art",
"arxiv:2303.15445",
"arxiv:2209.06293",
"region:us"
] | 2024-02-02T20:18:19+00:00 | {"language": ["en"], "size_categories": ["1K<n<10K"], "task_categories": ["visual-question-answering"], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "source_dataset", "dtype": "string"}, {"name": "claim", "dtype": "string"}, {"name": "label", "dtype": "string"}, {"name": "explanation", "dtype": "string"}, {"name": "split", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2987725345.698, "num_examples": 5637}, {"name": "validation", "num_bytes": 559076721.0, "num_examples": 740}], "download_size": 3480078971, "dataset_size": 3546802066.698}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "tags": ["art"]} | 2024-02-11T22:24:52+00:00 | [
"2303.15445",
"2209.06293"
] | [
"en"
] | TAGS
#task_categories-visual-question-answering #size_categories-1K<n<10K #language-English #art #arxiv-2303.15445 #arxiv-2209.06293 #region-us
59794dd112928a2c0aeb3880ea30a3729332ac5d |
# Dataset Card for Evaluation run of AiMavenAi/AiMaven-Prometheus
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AiMavenAi/AiMaven-Prometheus](https://huggingface.co/AiMavenAi/AiMaven-Prometheus) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AiMavenAi__AiMaven-Prometheus",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-02T20:40:21.719204](https://huggingface.co/datasets/open-llm-leaderboard/details_AiMavenAi__AiMaven-Prometheus/blob/main/results_2024-02-02T20-40-21.719204.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6576383819309891,
"acc_stderr": 0.03197046028999131,
"acc_norm": 0.6572056676322405,
"acc_norm_stderr": 0.032638819506296185,
"mc1": 0.5789473684210527,
"mc1_stderr": 0.017283936248136476,
"mc2": 0.7222211820734462,
"mc2_stderr": 0.014706410203602844
},
"harness|arc:challenge|25": {
"acc": 0.7107508532423208,
"acc_stderr": 0.013250012579393443,
"acc_norm": 0.7397610921501706,
"acc_norm_stderr": 0.01282193022511257
},
"harness|hellaswag|10": {
"acc": 0.7157936666002789,
"acc_stderr": 0.004501137895230724,
"acc_norm": 0.8882692690699064,
"acc_norm_stderr": 0.003143910361779264
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6814814814814815,
"acc_stderr": 0.04024778401977108,
"acc_norm": 0.6814814814814815,
"acc_norm_stderr": 0.04024778401977108
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700914,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700914
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.025467149045469546,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.025467149045469546
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7967741935483871,
"acc_stderr": 0.02289168798455496,
"acc_norm": 0.7967741935483871,
"acc_norm_stderr": 0.02289168798455496
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.03510766597959215,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.03510766597959215
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.0274796030105388,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.0274796030105388
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.02882088466625326,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.02882088466625326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977938,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977938
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993466,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993466
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545546,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545546
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.441340782122905,
"acc_stderr": 0.016607021781050873,
"acc_norm": 0.441340782122905,
"acc_norm_stderr": 0.016607021781050873
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042107,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042107
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.02982074719142248,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.02982074719142248
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.01904748523936038,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.01904748523936038
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.0282638899437846,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.0282638899437846
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5789473684210527,
"mc1_stderr": 0.017283936248136476,
"mc2": 0.7222211820734462,
"mc2_stderr": 0.014706410203602844
},
"harness|winogrande|5": {
"acc": 0.8516179952644041,
"acc_stderr": 0.00999070600518414
},
"harness|gsm8k|5": {
"acc": 0.690674753601213,
"acc_stderr": 0.012731710925078143
}
}
```
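If you only need the aggregated numbers above, they can also be pulled from the "results" configuration described earlier; a minimal sketch (the "latest" split name follows the convention used by these cards):

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics for each run;
# the "latest" split points at the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_AiMavenAi__AiMaven-Prometheus",
                       "results",
                       split="latest")
print(results[0])
```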
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_AiMavenAi__AiMaven-Prometheus | [
"region:us"
] | 2024-02-02T20:42:42+00:00 | {"pretty_name": "Evaluation run of AiMavenAi/AiMaven-Prometheus", "dataset_summary": "Dataset automatically created during the evaluation run of model [AiMavenAi/AiMaven-Prometheus](https://huggingface.co/AiMavenAi/AiMaven-Prometheus) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AiMavenAi__AiMaven-Prometheus\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T20:40:21.719204](https://huggingface.co/datasets/open-llm-leaderboard/details_AiMavenAi__AiMaven-Prometheus/blob/main/results_2024-02-02T20-40-21.719204.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6576383819309891,\n \"acc_stderr\": 0.03197046028999131,\n \"acc_norm\": 0.6572056676322405,\n \"acc_norm_stderr\": 0.032638819506296185,\n \"mc1\": 0.5789473684210527,\n \"mc1_stderr\": 0.017283936248136476,\n \"mc2\": 0.7222211820734462,\n \"mc2_stderr\": 0.014706410203602844\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7107508532423208,\n \"acc_stderr\": 0.013250012579393443,\n \"acc_norm\": 0.7397610921501706,\n \"acc_norm_stderr\": 0.01282193022511257\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7157936666002789,\n \"acc_stderr\": 0.004501137895230724,\n \"acc_norm\": 0.8882692690699064,\n \"acc_norm_stderr\": 0.003143910361779264\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6814814814814815,\n \"acc_stderr\": 0.04024778401977108,\n \"acc_norm\": 0.6814814814814815,\n \"acc_norm_stderr\": 0.04024778401977108\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700914,\n \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700914\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.025467149045469546,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.025467149045469546\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7967741935483871,\n \"acc_stderr\": 0.02289168798455496,\n \"acc_norm\": 0.7967741935483871,\n \"acc_norm_stderr\": 0.02289168798455496\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.03510766597959215,\n \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.03510766597959215\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.0274796030105388,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.0274796030105388\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977938,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977938\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8237547892720306,\n \"acc_stderr\": 0.013625556907993466,\n \"acc_norm\": 0.8237547892720306,\n \"acc_norm_stderr\": 0.013625556907993466\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545546,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545546\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.441340782122905,\n \"acc_stderr\": 0.016607021781050873,\n \"acc_norm\": 0.441340782122905,\n \"acc_norm_stderr\": 0.016607021781050873\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042107,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042107\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6683006535947712,\n \"acc_stderr\": 0.01904748523936038,\n \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.01904748523936038\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.0282638899437846,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.0282638899437846\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5789473684210527,\n \"mc1_stderr\": 0.017283936248136476,\n \"mc2\": 0.7222211820734462,\n \"mc2_stderr\": 0.014706410203602844\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8516179952644041,\n \"acc_stderr\": 0.00999070600518414\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.690674753601213,\n \"acc_stderr\": 0.012731710925078143\n 
}\n}\n```", "repo_url": "https://huggingface.co/AiMavenAi/AiMaven-Prometheus", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|arc:challenge|25_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|gsm8k|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hellaswag|10_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T20-40-21.719204.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T20-40-21.719204.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T20-40-21.719204.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T20-40-21.719204.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T20-40-21.719204.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T20_40_21.719204", "path": ["**/details_harness|winogrande|5_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T20-40-21.719204.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T20_40_21.719204", "path": ["results_2024-02-02T20-40-21.719204.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T20-40-21.719204.parquet"]}]}]} | 2024-02-02T20:43:07+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of AiMavenAi/AiMaven-Prometheus
Dataset automatically created during the evaluation run of model AiMavenAi/AiMaven-Prometheus on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
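The code snippet is stripped in this plain-text rendering; below is a minimal reconstruction following the pattern used in the markdown version of these cards (the repository id comes from this card's own metadata, and "harness_winogrande_5" is the configuration these cards conventionally use as an example):

```python
from datasets import load_dataset

# Per-sample details for one evaluated task; as described above, the
# "train" split points to the latest evaluation run.
data = load_dataset("open-llm-leaderboard/details_AiMavenAi__AiMaven-Prometheus",
	"harness_winogrande_5",
	split="train")
```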
## Latest results
These are the latest results from run 2024-02-02T20:40:21.719204 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of AiMavenAi/AiMaven-Prometheus\n\n\n\nDataset automatically created during the evaluation run of model AiMavenAi/AiMaven-Prometheus on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T20:40:21.719204(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of AiMavenAi/AiMaven-Prometheus\n\n\n\nDataset automatically created during the evaluation run of model AiMavenAi/AiMaven-Prometheus on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T20:40:21.719204(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
7e65f114510a2a517d7e9bd57b865e8f8376f28b |
# Dataset Card for Evaluation run of indischepartij/MiaLatte-Indo-Mistral-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [indischepartij/MiaLatte-Indo-Mistral-7b](https://huggingface.co/indischepartij/MiaLatte-Indo-Mistral-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_indischepartij__MiaLatte-Indo-Mistral-7b",
"harness_winogrande_5",
split="train")
```
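The aggregated metrics live in the "results" configuration mentioned above. A minimal sketch for loading them, assuming the config and split names listed in this card's metadata ("results", with a "latest" split alongside the timestamped one):

```python
from datasets import load_dataset

# "latest" always points at the most recent evaluation run; the timestamped
# split (2024_02_02T21_01_32.298650) pins this specific run instead.
results = load_dataset("open-llm-leaderboard/details_indischepartij__MiaLatte-Indo-Mistral-7b",
	"results",
	split="latest")
```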
## Latest results
These are the [latest results from run 2024-02-02T21:01:32.298650](https://huggingface.co/datasets/open-llm-leaderboard/details_indischepartij__MiaLatte-Indo-Mistral-7b/blob/main/results_2024-02-02T21-01-32.298650.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6406876517438664,
"acc_stderr": 0.03249440192541138,
"acc_norm": 0.6433275953940162,
"acc_norm_stderr": 0.033148828853324465,
"mc1": 0.397796817625459,
"mc1_stderr": 0.01713393424855964,
"mc2": 0.5604140555406965,
"mc2_stderr": 0.015238565419638243
},
"harness|arc:challenge|25": {
"acc": 0.6305460750853242,
"acc_stderr": 0.014104578366491894,
"acc_norm": 0.6655290102389079,
"acc_norm_stderr": 0.013787460322441372
},
"harness|hellaswag|10": {
"acc": 0.6590320653256323,
"acc_stderr": 0.004730658073041555,
"acc_norm": 0.852320254929297,
"acc_norm_stderr": 0.0035405716545956313
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099583,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099583
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.0235407993587233,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.0235407993587233
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586804,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586804
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.02338193534812143,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.02338193534812143
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.03006676158297793,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.03006676158297793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.015990154885073393,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.015990154885073393
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.033981108902946366,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.033981108902946366
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615771,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.04738975119274155,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.04738975119274155
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834834,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834834
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.02433214677913413,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.02433214677913413
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.34413407821229053,
"acc_stderr": 0.015889221313307094,
"acc_norm": 0.34413407821229053,
"acc_norm_stderr": 0.015889221313307094
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.02540383297817961,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.02540383297817961
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7191358024691358,
"acc_stderr": 0.025006469755799208,
"acc_norm": 0.7191358024691358,
"acc_norm_stderr": 0.025006469755799208
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.02975238965742705,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.02975238965742705
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4452411994784876,
"acc_stderr": 0.012693421303973294,
"acc_norm": 0.4452411994784876,
"acc_norm_stderr": 0.012693421303973294
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.019206606848825365,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.019206606848825365
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827075,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827075
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.397796817625459,
"mc1_stderr": 0.01713393424855964,
"mc2": 0.5604140555406965,
"mc2_stderr": 0.015238565419638243
},
"harness|winogrande|5": {
"acc": 0.8034727703235991,
"acc_stderr": 0.011168120593569567
},
"harness|gsm8k|5": {
"acc": 0.5504169825625473,
"acc_stderr": 0.01370229004788474
}
}
```
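If you prefer working with the raw file over the `datasets` API, the results JSON linked above can be fetched directly. A sketch, assuming `huggingface_hub` is installed; the repo and file names are copied from the link above, and the key layout follows the snippet shown (the on-disk file may nest the metrics under a top-level "results" key):

```python
import json

from huggingface_hub import hf_hub_download

# Download the exact results file referenced in the link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_indischepartij__MiaLatte-Indo-Mistral-7b",
    filename="results_2024-02-02T21-01-32.298650.json",
    repo_type="dataset",
)

with open(path) as f:
    data = json.load(f)

# Fall back to the top level if the metrics are not nested under "results".
metrics = data.get("results", data)
print(metrics["harness|gsm8k|5"]["acc"])  # 0.5504169825625473
```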
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_indischepartij__MiaLatte-Indo-Mistral-7b | [
"region:us"
] | 2024-02-02T21:03:53+00:00 | {"pretty_name": "Evaluation run of indischepartij/MiaLatte-Indo-Mistral-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [indischepartij/MiaLatte-Indo-Mistral-7b](https://huggingface.co/indischepartij/MiaLatte-Indo-Mistral-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_indischepartij__MiaLatte-Indo-Mistral-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T21:01:32.298650](https://huggingface.co/datasets/open-llm-leaderboard/details_indischepartij__MiaLatte-Indo-Mistral-7b/blob/main/results_2024-02-02T21-01-32.298650.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6406876517438664,\n \"acc_stderr\": 0.03249440192541138,\n \"acc_norm\": 0.6433275953940162,\n \"acc_norm_stderr\": 0.033148828853324465,\n \"mc1\": 0.397796817625459,\n \"mc1_stderr\": 0.01713393424855964,\n \"mc2\": 0.5604140555406965,\n \"mc2_stderr\": 0.015238565419638243\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6305460750853242,\n \"acc_stderr\": 0.014104578366491894,\n \"acc_norm\": 0.6655290102389079,\n \"acc_norm_stderr\": 0.013787460322441372\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6590320653256323,\n \"acc_stderr\": 0.004730658073041555,\n \"acc_norm\": 0.852320254929297,\n \"acc_norm_stderr\": 0.0035405716545956313\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n \"acc_stderr\": 0.04266763404099583,\n \"acc_norm\": 0.5777777777777777,\n \"acc_norm_stderr\": 0.04266763404099583\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n 
},\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n \"acc_stderr\": 0.0235407993587233,\n \"acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.0235407993587233\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586804,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586804\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.02338193534812143,\n \"acc_norm\": 0.8808290155440415,\n 
\"acc_norm_stderr\": 0.02338193534812143\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297793,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297793\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8330275229357799,\n \"acc_stderr\": 0.015990154885073393,\n \"acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.015990154885073393\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.033981108902946366,\n \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.033981108902946366\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516303,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516303\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615771,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615771\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.04738975119274155,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.04738975119274155\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n \"acc_stderr\": 0.013664230995834834,\n \"acc_norm\": 0.822477650063857,\n \"acc_norm_stderr\": 0.013664230995834834\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.02433214677913413,\n \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.02433214677913413\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.34413407821229053,\n \"acc_stderr\": 0.015889221313307094,\n \"acc_norm\": 0.34413407821229053,\n \"acc_norm_stderr\": 0.015889221313307094\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n \"acc_stderr\": 0.02540383297817961,\n \"acc_norm\": 0.7234726688102894,\n \"acc_norm_stderr\": 0.02540383297817961\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.025006469755799208,\n \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.025006469755799208\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4645390070921986,\n \"acc_stderr\": 0.02975238965742705,\n \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.02975238965742705\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4452411994784876,\n \"acc_stderr\": 0.012693421303973294,\n \"acc_norm\": 0.4452411994784876,\n \"acc_norm_stderr\": 0.012693421303973294\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335303,\n \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335303\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6568627450980392,\n \"acc_stderr\": 0.019206606848825365,\n \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.019206606848825365\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.025196929874827075,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.025196929874827075\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.397796817625459,\n \"mc1_stderr\": 0.01713393424855964,\n \"mc2\": 0.5604140555406965,\n \"mc2_stderr\": 0.015238565419638243\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8034727703235991,\n \"acc_stderr\": 0.011168120593569567\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.5504169825625473,\n \"acc_stderr\": 0.01370229004788474\n }\n}\n```", "repo_url": "https://huggingface.co/indischepartij/MiaLatte-Indo-Mistral-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|arc:challenge|25_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|gsm8k|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hellaswag|10_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T21-01-32.298650.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T21-01-32.298650.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T21-01-32.298650.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T21-01-32.298650.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T21-01-32.298650.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["**/details_harness|winogrande|5_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-02T21-01-32.298650.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_02T21_01_32.298650", "path": ["results_2024-02-02T21-01-32.298650.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T21-01-32.298650.parquet"]}]}]} | 2024-02-02T21:04:20+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of indischepartij/MiaLatte-Indo-Mistral-7b
Dataset automatically created during the evaluation run of model indischepartij/MiaLatte-Indo-Mistral-7b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
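A minimal loading sketch; note that the repository id `open-llm-leaderboard/details_indischepartij__MiaLatte-Indo-Mistral-7b` is inferred from the leaderboard's usual "details_<org>__<model>" naming convention and the parquet paths above, rather than stated verbatim in this record:

```python
from datasets import load_dataset

# Repo id assumed from the leaderboard's "details_<org>__<model>" convention.
data = load_dataset("open-llm-leaderboard/details_indischepartij__MiaLatte-Indo-Mistral-7b",
	"harness_winogrande_5",
	split="train")
```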
## Latest results
These are the latest results from run 2024-02-02T21:01:32.298650 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of indischepartij/MiaLatte-Indo-Mistral-7b\n\n\n\nDataset automatically created during the evaluation run of model indischepartij/MiaLatte-Indo-Mistral-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T21:01:32.298650(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of indischepartij/MiaLatte-Indo-Mistral-7b\n\n\n\nDataset automatically created during the evaluation run of model indischepartij/MiaLatte-Indo-Mistral-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T21:01:32.298650(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
204d83e08f3c8b8188f59fa234085dcd2a3e715f |
# Dataset Card for Evaluation run of indischepartij/OpenMia-Indo-Mistral-7b-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [indischepartij/OpenMia-Indo-Mistral-7b-v2](https://huggingface.co/indischepartij/OpenMia-Indo-Mistral-7b-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Mistral-7b-v2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-02T21:08:39.122090](https://huggingface.co/datasets/open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Mistral-7b-v2/blob/main/results_2024-02-02T21-08-39.122090.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6240607574132118,
"acc_stderr": 0.032532796626580374,
"acc_norm": 0.6300113550132161,
"acc_norm_stderr": 0.033195769514987344,
"mc1": 0.3072215422276622,
"mc1_stderr": 0.016150201321323013,
"mc2": 0.4434739529053457,
"mc2_stderr": 0.014529702448189592
},
"harness|arc:challenge|25": {
"acc": 0.5733788395904437,
"acc_stderr": 0.014453185592920293,
"acc_norm": 0.6032423208191127,
"acc_norm_stderr": 0.014296513020180639
},
"harness|hellaswag|10": {
"acc": 0.6270663214499104,
"acc_stderr": 0.004825963768772224,
"acc_norm": 0.8311093407687712,
"acc_norm_stderr": 0.0037388962449538122
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.03878139888797611,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.03878139888797611
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.02522545028406788,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.02522545028406788
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7548387096774194,
"acc_stderr": 0.024472243840895514,
"acc_norm": 0.7548387096774194,
"acc_norm_stderr": 0.024472243840895514
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659355,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6384615384615384,
"acc_stderr": 0.024359581465396997,
"acc_norm": 0.6384615384615384,
"acc_norm_stderr": 0.024359581465396997
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606648,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606648
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6260504201680672,
"acc_stderr": 0.031429466378837076,
"acc_norm": 0.6260504201680672,
"acc_norm_stderr": 0.031429466378837076
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8073394495412844,
"acc_stderr": 0.016909276884936077,
"acc_norm": 0.8073394495412844,
"acc_norm_stderr": 0.016909276884936077
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.0340763209385405,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.0340763209385405
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639318,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676173,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676173
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.0364129708131373,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.0364129708131373
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650743,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650743
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.04738975119274155,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.04738975119274155
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.037601780060266196,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.037601780060266196
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8007662835249042,
"acc_stderr": 0.014283378044296418,
"acc_norm": 0.8007662835249042,
"acc_norm_stderr": 0.014283378044296418
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.024332146779134128,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.024332146779134128
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2782122905027933,
"acc_stderr": 0.01498732543996355,
"acc_norm": 0.2782122905027933,
"acc_norm_stderr": 0.01498732543996355
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.02609016250427905,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.02609016250427905
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984824,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984824
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7098765432098766,
"acc_stderr": 0.025251173936495026,
"acc_norm": 0.7098765432098766,
"acc_norm_stderr": 0.025251173936495026
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4426336375488918,
"acc_stderr": 0.012685906538206244,
"acc_norm": 0.4426336375488918,
"acc_norm_stderr": 0.012685906538206244
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983572,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983572
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6552287581699346,
"acc_stderr": 0.01922832201869664,
"acc_norm": 0.6552287581699346,
"acc_norm_stderr": 0.01922832201869664
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197773,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197773
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3072215422276622,
"mc1_stderr": 0.016150201321323013,
"mc2": 0.4434739529053457,
"mc2_stderr": 0.014529702448189592
},
"harness|winogrande|5": {
"acc": 0.7829518547750592,
"acc_stderr": 0.011585871710209403
},
"harness|gsm8k|5": {
"acc": 0.3479909021986353,
"acc_stderr": 0.013120581030382132
}
}
```
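To work with these aggregates programmatically, the "results" configuration listed in this card's metadata can be loaded directly. This is a minimal sketch, assuming the parquet's column layout mirrors the JSON block above:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics of each run; the "latest" split
# always points at the most recent one (2024-02-02T21:08:39 here).
results = load_dataset("open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Mistral-7b-v2",
	"results",
	split="latest")

# Column layout is assumed to mirror the JSON above.
print(results[0])
```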
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Mistral-7b-v2 | [
"region:us"
] | 2024-02-02T21:11:02+00:00 | {"pretty_name": "Evaluation run of indischepartij/OpenMia-Indo-Mistral-7b-v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [indischepartij/OpenMia-Indo-Mistral-7b-v2](https://huggingface.co/indischepartij/OpenMia-Indo-Mistral-7b-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Mistral-7b-v2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T21:08:39.122090](https://huggingface.co/datasets/open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Mistral-7b-v2/blob/main/results_2024-02-02T21-08-39.122090.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6240607574132118,\n \"acc_stderr\": 0.032532796626580374,\n \"acc_norm\": 0.6300113550132161,\n \"acc_norm_stderr\": 0.033195769514987344,\n \"mc1\": 0.3072215422276622,\n \"mc1_stderr\": 0.016150201321323013,\n \"mc2\": 0.4434739529053457,\n \"mc2_stderr\": 0.014529702448189592\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5733788395904437,\n \"acc_stderr\": 0.014453185592920293,\n \"acc_norm\": 0.6032423208191127,\n \"acc_norm_stderr\": 0.014296513020180639\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6270663214499104,\n \"acc_stderr\": 0.004825963768772224,\n \"acc_norm\": 0.8311093407687712,\n \"acc_norm_stderr\": 0.0037388962449538122\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.03878139888797611,\n \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.03878139888797611\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n \"acc_norm_stderr\": 
0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3994708994708995,\n \"acc_stderr\": 0.02522545028406788,\n \"acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.02522545028406788\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7548387096774194,\n \"acc_stderr\": 0.024472243840895514,\n \"acc_norm\": 0.7548387096774194,\n \"acc_norm_stderr\": 0.024472243840895514\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659355,\n \"acc_norm\": 
0.8756476683937824,\n \"acc_norm_stderr\": 0.02381447708659355\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6384615384615384,\n \"acc_stderr\": 0.024359581465396997,\n \"acc_norm\": 0.6384615384615384,\n \"acc_norm_stderr\": 0.024359581465396997\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606648,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606648\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6260504201680672,\n \"acc_stderr\": 0.031429466378837076,\n \"acc_norm\": 0.6260504201680672,\n \"acc_norm_stderr\": 0.031429466378837076\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8073394495412844,\n \"acc_stderr\": 0.016909276884936077,\n \"acc_norm\": 0.8073394495412844,\n \"acc_norm_stderr\": 0.016909276884936077\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.0340763209385405,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.0340763209385405\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.027865942286639318,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639318\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676173,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676173\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.0364129708131373,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.0364129708131373\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.04330043749650743,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.04330043749650743\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.04738975119274155,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.04738975119274155\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.037601780060266196,\n \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.037601780060266196\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n 
},\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8007662835249042,\n \"acc_stderr\": 0.014283378044296418,\n \"acc_norm\": 0.8007662835249042,\n \"acc_norm_stderr\": 0.014283378044296418\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.024332146779134128,\n \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.024332146779134128\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2782122905027933,\n \"acc_stderr\": 0.01498732543996355,\n \"acc_norm\": 0.2782122905027933,\n \"acc_norm_stderr\": 0.01498732543996355\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.02609016250427905,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.02609016250427905\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n \"acc_stderr\": 0.025583062489984824,\n \"acc_norm\": 0.7170418006430869,\n \"acc_norm_stderr\": 0.025583062489984824\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495026,\n \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495026\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4426336375488918,\n \"acc_stderr\": 0.012685906538206244,\n \"acc_norm\": 0.4426336375488918,\n \"acc_norm_stderr\": 0.012685906538206244\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6552287581699346,\n \"acc_stderr\": 0.01922832201869664,\n \"acc_norm\": 0.6552287581699346,\n \"acc_norm_stderr\": 0.01922832201869664\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197773,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197773\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3072215422276622,\n \"mc1_stderr\": 0.016150201321323013,\n \"mc2\": 0.4434739529053457,\n \"mc2_stderr\": 0.014529702448189592\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7829518547750592,\n \"acc_stderr\": 0.011585871710209403\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.3479909021986353,\n \"acc_stderr\": 0.013120581030382132\n }\n}\n```", "repo_url": "https://huggingface.co/indischepartij/OpenMia-Indo-Mistral-7b-v2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|arc:challenge|25_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|gsm8k|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hellaswag|10_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T21-08-39.122090.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T21-08-39.122090.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T21-08-39.122090.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T21-08-39.122090.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T21-08-39.122090.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["**/details_harness|winogrande|5_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-02T21-08-39.122090.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_02T21_08_39.122090", "path": ["results_2024-02-02T21-08-39.122090.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T21-08-39.122090.parquet"]}]}]} | 2024-02-02T21:11:30+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of indischepartij/OpenMia-Indo-Mistral-7b-v2
Dataset automatically created during the evaluation run of model indischepartij/OpenMia-Indo-Mistral-7b-v2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
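A minimal sketch, assuming the repository follows the leaderboard's standard `details_<org>__<model>` naming (the `harness_winogrande_5` config and its `train` alias appear in the configuration metadata above):

```python
from datasets import load_dataset

# Repository id assumed from the details_<org>__<model> naming scheme;
# the config and split names follow this card's configuration listing.
data = load_dataset(
    "open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Mistral-7b-v2",
    "harness_winogrande_5",
    split="train",
)
```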
## Latest results
These are the latest results from run 2024-02-02T21:08:39.122090 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
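A similarly hedged sketch for fetching those aggregated numbers; the `results` config and its `latest` split are listed in this record's configuration metadata, while the repository id is again an assumption:

```python
from datasets import load_dataset

# "results" holds the aggregated run-level metrics; per the configuration
# listing, the "latest" split points at the most recent results file.
results = load_dataset(
    "open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Mistral-7b-v2",
    "results",
    split="latest",
)
print(results.features)  # schema is not documented here, so inspect it first
```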
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of indischepartij/OpenMia-Indo-Mistral-7b-v2\n\n\n\nDataset automatically created during the evaluation run of model indischepartij/OpenMia-Indo-Mistral-7b-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T21:08:39.122090(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of indischepartij/OpenMia-Indo-Mistral-7b-v2\n\n\n\nDataset automatically created during the evaluation run of model indischepartij/OpenMia-Indo-Mistral-7b-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T21:08:39.122090(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
26dbd72c8d097ffe5dde333f6e3322b9ea758122 | <p align="center"><img src="images/banner.png"/></p>
# Overview
Information on **8,200 games** and more than **25,000 participants** of the world's most popular '**[Global Game Jam](https://globalgamejam.org/)**'. Data collected since 2024.
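A quick, hedged way to pull the data with the `datasets` library (the repository id matches this card; column names aren't documented here, so inspect the features after loading):

```python
from datasets import load_dataset

# Loads the default configuration of the Global Game Jam dataset;
# the schema is not documented in this card, so print it before use.
dataset = load_dataset("FronkonGames/Global-Game-Jam-Dataset")
print(dataset)
```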
Maintained by **[Fronkon Games](https://github.com/FronkonGames)**. | FronkonGames/Global-Game-Jam-Dataset | [
"task_categories:text-generation",
"task_categories:text2text-generation",
"size_categories:10K<n<100K",
"language:en",
"license:mit",
"global",
"jam",
"jammers",
"gamedev",
"indiedev",
"doi:10.57967/hf/1738",
"region:us"
] | 2024-02-02T21:18:02+00:00 | {"language": ["en"], "license": "mit", "size_categories": ["10K<n<100K"], "task_categories": ["text-generation", "text2text-generation"], "pretty_name": "Global Game Jam Dataset", "tags": ["global", "jam", "jammers", "gamedev", "indiedev"], "configs": [{"config_name": "default"}]} | 2024-02-12T17:55:43+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-generation #task_categories-text2text-generation #size_categories-10K<n<100K #language-English #license-mit #global #jam #jammers #gamedev #indiedev #doi-10.57967/hf/1738 #region-us
| <p align="center"><img src="images/URL"/></p>
# Overview
Information on 8,200 games and more than 25,000 participants of the world's most popular 'Global Game Jam'. Data collected since 2024.
Maintained by Fronkon Games. | [
"# Overview\n\nInformation on 8,200 games and more than 25,000 participants of the world's most popular 'Global Game Jam'. Data collected since 2024.\n\nMaintained by Fronkon Games."
] | [
"TAGS\n#task_categories-text-generation #task_categories-text2text-generation #size_categories-10K<n<100K #language-English #license-mit #global #jam #jammers #gamedev #indiedev #doi-10.57967/hf/1738 #region-us \n",
"# Overview\n\nInformation on 8,200 games and more than 25,000 participants of the world's most popular 'Global Game Jam'. Data collected since 2024.\n\nMaintained by Fronkon Games."
] |
79dec2b1893d1e47789a9ca4259cf6fc6fffa435 |
# Dataset Card for Evaluation run of indischepartij/OpenMia-Indo-Mistral-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [indischepartij/OpenMia-Indo-Mistral-7b](https://huggingface.co/indischepartij/OpenMia-Indo-Mistral-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
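# Per-sample details for the 5-shot Winogrande eval; "train" aliases the latest run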
data = load_dataset("open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Mistral-7b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-02T21:17:16.535134](https://huggingface.co/datasets/open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Mistral-7b/blob/main/results_2024-02-02T21-17-16.535134.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6242826022949111,
"acc_stderr": 0.032680414364733124,
"acc_norm": 0.6304091726284682,
"acc_norm_stderr": 0.0333471128489704,
"mc1": 0.31211750305997554,
"mc1_stderr": 0.016220756769520922,
"mc2": 0.4525653160173771,
"mc2_stderr": 0.014454991919014401
},
"harness|arc:challenge|25": {
"acc": 0.5614334470989761,
"acc_stderr": 0.014500682618212864,
"acc_norm": 0.5964163822525598,
"acc_norm_stderr": 0.014337158914268447
},
"harness|hellaswag|10": {
"acc": 0.6299541923919538,
"acc_stderr": 0.004818298991012548,
"acc_norm": 0.8318064130651265,
"acc_norm_stderr": 0.00373273677042972
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926604,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800893,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800893
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.049020713000019756,
"acc_norm": 0.39,
"acc_norm_stderr": 0.049020713000019756
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.025197101074246483,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.025197101074246483
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7516129032258064,
"acc_stderr": 0.024580028921481006,
"acc_norm": 0.7516129032258064,
"acc_norm_stderr": 0.024580028921481006
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.03095405547036592,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.03095405547036592
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593542,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593542
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131147,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131147
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.03120469122515002,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.03120469122515002
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8091743119266055,
"acc_stderr": 0.01684767640009109,
"acc_norm": 0.8091743119266055,
"acc_norm_stderr": 0.01684767640009109
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.033981108902946366,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.033981108902946366
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.02886743144984932,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.02886743144984932
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455005,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455005
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615624,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077833,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077833
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8045977011494253,
"acc_stderr": 0.014179171373424384,
"acc_norm": 0.8045977011494253,
"acc_norm_stderr": 0.014179171373424384
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.684971098265896,
"acc_stderr": 0.025009313790069723,
"acc_norm": 0.684971098265896,
"acc_norm_stderr": 0.025009313790069723
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27932960893854747,
"acc_stderr": 0.015005762446786166,
"acc_norm": 0.27932960893854747,
"acc_norm_stderr": 0.015005762446786166
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826517,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826517
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885142,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885142
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.025630824975621348,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.025630824975621348
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43415906127770537,
"acc_stderr": 0.012659033237067248,
"acc_norm": 0.43415906127770537,
"acc_norm_stderr": 0.012659033237067248
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6486928104575164,
"acc_stderr": 0.019312676065786558,
"acc_norm": 0.6486928104575164,
"acc_norm_stderr": 0.019312676065786558
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.02927956741106568,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.02927956741106568
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.02553843336857833,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.02553843336857833
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774711,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774711
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368043,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368043
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31211750305997554,
"mc1_stderr": 0.016220756769520922,
"mc2": 0.4525653160173771,
"mc2_stderr": 0.014454991919014401
},
"harness|winogrande|5": {
"acc": 0.7782162588792423,
"acc_stderr": 0.011676109244497811
},
"harness|gsm8k|5": {
"acc": 0.3457164518574678,
"acc_stderr": 0.013100422990441582
}
}
```
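As a minimal sketch, the aggregated metrics above can also be fetched programmatically through the "results" configuration documented in this card; the `latest` split name follows the generator's usual file listing (an assumption worth checking against your copy, which may expose `train` instead):

```python
from datasets import load_dataset

# Aggregated run-level metrics; "latest" points at the most recent
# results file, per this card's configuration conventions.
results = load_dataset(
    "open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Mistral-7b",
    "results",
    split="latest",
)
print(results.features)  # schema is not documented here, so inspect it first
```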
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Mistral-7b | [
"region:us"
] | 2024-02-02T21:19:37+00:00 | {"pretty_name": "Evaluation run of indischepartij/OpenMia-Indo-Mistral-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [indischepartij/OpenMia-Indo-Mistral-7b](https://huggingface.co/indischepartij/OpenMia-Indo-Mistral-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Mistral-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T21:17:16.535134](https://huggingface.co/datasets/open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Mistral-7b/blob/main/results_2024-02-02T21-17-16.535134.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6242826022949111,\n \"acc_stderr\": 0.032680414364733124,\n \"acc_norm\": 0.6304091726284682,\n \"acc_norm_stderr\": 0.0333471128489704,\n \"mc1\": 0.31211750305997554,\n \"mc1_stderr\": 0.016220756769520922,\n \"mc2\": 0.4525653160173771,\n \"mc2_stderr\": 0.014454991919014401\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5614334470989761,\n \"acc_stderr\": 0.014500682618212864,\n \"acc_norm\": 0.5964163822525598,\n \"acc_norm_stderr\": 0.014337158914268447\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6299541923919538,\n \"acc_stderr\": 0.004818298991012548,\n \"acc_norm\": 0.8318064130651265,\n \"acc_norm_stderr\": 0.00373273677042972\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926604,\n \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926604\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800893,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800893\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.049020713000019756,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.049020713000019756\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.025197101074246483,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.025197101074246483\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7516129032258064,\n \"acc_stderr\": 0.024580028921481006,\n \"acc_norm\": 0.7516129032258064,\n \"acc_norm_stderr\": 0.024580028921481006\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7474747474747475,\n \"acc_stderr\": 0.03095405547036592,\n \"acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.03095405547036592\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593542,\n \"acc_norm\": 0.8756476683937824,\n 
\"acc_norm_stderr\": 0.023814477086593542\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131147,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131147\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.03120469122515002,\n \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.03120469122515002\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8091743119266055,\n \"acc_stderr\": 0.01684767640009109,\n \"acc_norm\": 0.8091743119266055,\n \"acc_norm_stderr\": 0.01684767640009109\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.033981108902946366,\n \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.033981108902946366\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7843137254901961,\n \"acc_stderr\": 0.02886743144984932,\n \"acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.02886743144984932\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n \"acc_stderr\": 0.031708824268455005,\n \"acc_norm\": 0.6636771300448431,\n \"acc_norm_stderr\": 0.031708824268455005\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516303,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516303\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615624,\n \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615624\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n \"acc_stderr\": 0.022509033937077833,\n \"acc_norm\": 0.8632478632478633,\n \"acc_norm_stderr\": 0.022509033937077833\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8045977011494253,\n \"acc_stderr\": 0.014179171373424384,\n \"acc_norm\": 0.8045977011494253,\n \"acc_norm_stderr\": 0.014179171373424384\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.684971098265896,\n \"acc_stderr\": 0.025009313790069723,\n \"acc_norm\": 0.684971098265896,\n \"acc_norm_stderr\": 0.025009313790069723\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27932960893854747,\n \"acc_stderr\": 0.015005762446786166,\n \"acc_norm\": 0.27932960893854747,\n \"acc_norm_stderr\": 0.015005762446786166\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826517,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826517\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n \"acc_stderr\": 0.026003301117885142,\n \"acc_norm\": 0.7009646302250804,\n \"acc_norm_stderr\": 0.026003301117885142\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.025630824975621348,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.025630824975621348\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43415906127770537,\n \"acc_stderr\": 0.012659033237067248,\n \"acc_norm\": 0.43415906127770537,\n \"acc_norm_stderr\": 0.012659033237067248\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6486928104575164,\n \"acc_stderr\": 0.019312676065786558,\n \"acc_norm\": 0.6486928104575164,\n \"acc_norm_stderr\": 0.019312676065786558\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.02927956741106568,\n \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.02927956741106568\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.02553843336857833,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.02553843336857833\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774711,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774711\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368043,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368043\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31211750305997554,\n \"mc1_stderr\": 0.016220756769520922,\n \"mc2\": 0.4525653160173771,\n \"mc2_stderr\": 0.014454991919014401\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7782162588792423,\n \"acc_stderr\": 0.011676109244497811\n },\n \"harness|gsm8k|5\": {\n 
\"acc\": 0.3457164518574678,\n \"acc_stderr\": 0.013100422990441582\n }\n}\n```", "repo_url": "https://huggingface.co/indischepartij/OpenMia-Indo-Mistral-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|arc:challenge|25_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|gsm8k|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hellaswag|10_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T21-17-16.535134.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T21-17-16.535134.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T21-17-16.535134.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T21-17-16.535134.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T21-17-16.535134.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["**/details_harness|winogrande|5_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-02T21-17-16.535134.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_02T21_17_16.535134", "path": ["results_2024-02-02T21-17-16.535134.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T21-17-16.535134.parquet"]}]}]} | 2024-02-02T21:20:02+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of indischepartij/OpenMia-Indo-Mistral-7b
Dataset automatically created during the evaluation run of model indischepartij/OpenMia-Indo-Mistral-7b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
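A minimal loading sketch, mirroring the snippet used by the other evaluation-run cards in this collection; the repository id `open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Mistral-7b` is inferred from the leaderboard's `details_<org>__<model>` naming convention (an assumption, as this record does not state it), and `harness_winogrande_5` stands in for any of the 63 configurations:

```python
from datasets import load_dataset

# Repository id assumed from the leaderboard's details_<org>__<model> convention
data = load_dataset("open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Mistral-7b",
	"harness_winogrande_5",  # any of the 63 task configurations works here
	split="train")           # "train" always points to the latest results
```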
## Latest results
These are the latest results from run 2024-02-02T21:17:16.535134 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of indischepartij/OpenMia-Indo-Mistral-7b\n\n\n\nDataset automatically created during the evaluation run of model indischepartij/OpenMia-Indo-Mistral-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T21:17:16.535134(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of indischepartij/OpenMia-Indo-Mistral-7b\n\n\n\nDataset automatically created during the evaluation run of model indischepartij/OpenMia-Indo-Mistral-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T21:17:16.535134(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
131feba81cfa97be8916e4a6f3016b1fa24631b5 | # Dataset Card for "summarize_dpo1b1_ngen10_20k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | arianhosseini/summarize_dpo1b1_ngen10_20k | [
"region:us"
] | 2024-02-02T21:20:16+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "chosen", "dtype": "string"}, {"name": "rejected", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 35964940, "num_examples": 20000}], "download_size": 20633481, "dataset_size": 35964940}} | 2024-02-02T21:20:19+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "summarize_dpo1b1_ngen10_20k"
More Information needed | [
"# Dataset Card for \"summarize_dpo1b1_ngen10_20k\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"summarize_dpo1b1_ngen10_20k\"\n\nMore Information needed"
] |
4f5256ebea522b5c614701999de162fc1acd1535 |
# Dataset Card for Evaluation run of Technoculture/MT7Bi-alpha-dpo
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Technoculture/MT7Bi-alpha-dpo](https://huggingface.co/Technoculture/MT7Bi-alpha-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Technoculture__MT7Bi-alpha-dpo",
"harness_winogrande_5",
split="train")
```
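The aggregated metrics shown under "Latest results" below are also exposed as a standalone "results" configuration; a short sketch, assuming this repository follows the same configuration layout and "latest" split alias as the sibling evaluation-run repositories in this collection:

```python
from datasets import load_dataset

# Aggregated metrics live in the "results" configuration;
# the "latest" split alias is assumed from the layout of sibling details repos.
results = load_dataset("open-llm-leaderboard/details_Technoculture__MT7Bi-alpha-dpo",
	"results",
	split="latest")

# Individual tasks follow the same pattern, e.g. one MMLU subtask:
world_religions = load_dataset("open-llm-leaderboard/details_Technoculture__MT7Bi-alpha-dpo",
	"harness_hendrycksTest_world_religions_5",
	split="latest")
```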
## Latest results
These are the [latest results from run 2024-02-02T21:20:32.408861](https://huggingface.co/datasets/open-llm-leaderboard/details_Technoculture__MT7Bi-alpha-dpo/blob/main/results_2024-02-02T21-20-32.408861.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5253447014480443,
"acc_stderr": 0.034195124118131595,
"acc_norm": 0.530565322003796,
"acc_norm_stderr": 0.034921847920628496,
"mc1": 0.2802937576499388,
"mc1_stderr": 0.015723139524608767,
"mc2": 0.43810210168491254,
"mc2_stderr": 0.01497369498317419
},
"harness|arc:challenge|25": {
"acc": 0.5085324232081911,
"acc_stderr": 0.014609263165632186,
"acc_norm": 0.5503412969283277,
"acc_norm_stderr": 0.014537144444284738
},
"harness|hellaswag|10": {
"acc": 0.570902210714997,
"acc_stderr": 0.00493935814556132,
"acc_norm": 0.7545309699263095,
"acc_norm_stderr": 0.004294853999177863
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4605263157894737,
"acc_stderr": 0.04056242252249034,
"acc_norm": 0.4605263157894737,
"acc_norm_stderr": 0.04056242252249034
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6377358490566037,
"acc_stderr": 0.029582245128384303,
"acc_norm": 0.6377358490566037,
"acc_norm_stderr": 0.029582245128384303
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5902777777777778,
"acc_stderr": 0.04112490974670788,
"acc_norm": 0.5902777777777778,
"acc_norm_stderr": 0.04112490974670788
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4913294797687861,
"acc_stderr": 0.03811890988940412,
"acc_norm": 0.4913294797687861,
"acc_norm_stderr": 0.03811890988940412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.045766654032077615,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.045766654032077615
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.032500536843658404,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.032500536843658404
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.04537815354939392,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.04537815354939392
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.02397386199899208,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.02397386199899208
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0404061017820884,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0404061017820884
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5774193548387097,
"acc_stderr": 0.02810096472427264,
"acc_norm": 0.5774193548387097,
"acc_norm_stderr": 0.02810096472427264
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.034991131376767445,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.034991131376767445
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6616161616161617,
"acc_stderr": 0.033711241426263014,
"acc_norm": 0.6616161616161617,
"acc_norm_stderr": 0.033711241426263014
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7202072538860104,
"acc_stderr": 0.032396370467357036,
"acc_norm": 0.7202072538860104,
"acc_norm_stderr": 0.032396370467357036
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4717948717948718,
"acc_stderr": 0.025310639254933886,
"acc_norm": 0.4717948717948718,
"acc_norm_stderr": 0.025310639254933886
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340492,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5042016806722689,
"acc_stderr": 0.03247734334448111,
"acc_norm": 0.5042016806722689,
"acc_norm_stderr": 0.03247734334448111
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943342,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943342
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7229357798165138,
"acc_stderr": 0.01918848259016953,
"acc_norm": 0.7229357798165138,
"acc_norm_stderr": 0.01918848259016953
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.03362277436608044,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.03362277436608044
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03308611113236436,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03308611113236436
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.600896860986547,
"acc_stderr": 0.03286745312567961,
"acc_norm": 0.600896860986547,
"acc_norm_stderr": 0.03286745312567961
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6030534351145038,
"acc_stderr": 0.04291135671009225,
"acc_norm": 0.6030534351145038,
"acc_norm_stderr": 0.04291135671009225
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6611570247933884,
"acc_stderr": 0.0432076780753667,
"acc_norm": 0.6611570247933884,
"acc_norm_stderr": 0.0432076780753667
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04643454608906275,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04643454608906275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6380368098159509,
"acc_stderr": 0.037757007291414416,
"acc_norm": 0.6380368098159509,
"acc_norm_stderr": 0.037757007291414416
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.044532548363264673,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.044532548363264673
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7692307692307693,
"acc_stderr": 0.027601921381417618,
"acc_norm": 0.7692307692307693,
"acc_norm_stderr": 0.027601921381417618
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.722860791826309,
"acc_stderr": 0.01600563629412242,
"acc_norm": 0.722860791826309,
"acc_norm_stderr": 0.01600563629412242
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.026424816594009845,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.026424816594009845
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2446927374301676,
"acc_stderr": 0.014378169884098447,
"acc_norm": 0.2446927374301676,
"acc_norm_stderr": 0.014378169884098447
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6045751633986928,
"acc_stderr": 0.02799672318063144,
"acc_norm": 0.6045751633986928,
"acc_norm_stderr": 0.02799672318063144
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5562700964630225,
"acc_stderr": 0.02821768355665231,
"acc_norm": 0.5562700964630225,
"acc_norm_stderr": 0.02821768355665231
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.027513747284379424,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.027513747284379424
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36879432624113473,
"acc_stderr": 0.028782227561347237,
"acc_norm": 0.36879432624113473,
"acc_norm_stderr": 0.028782227561347237
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3813559322033898,
"acc_stderr": 0.012405509401888124,
"acc_norm": 0.3813559322033898,
"acc_norm_stderr": 0.012405509401888124
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6102941176470589,
"acc_stderr": 0.029624663581159703,
"acc_norm": 0.6102941176470589,
"acc_norm_stderr": 0.029624663581159703
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5261437908496732,
"acc_stderr": 0.020200164564804588,
"acc_norm": 0.5261437908496732,
"acc_norm_stderr": 0.020200164564804588
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.046737523336702384,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.046737523336702384
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6448979591836734,
"acc_stderr": 0.030635655150387638,
"acc_norm": 0.6448979591836734,
"acc_norm_stderr": 0.030635655150387638
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.582089552238806,
"acc_stderr": 0.034875586404620636,
"acc_norm": 0.582089552238806,
"acc_norm_stderr": 0.034875586404620636
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.038879718495972646,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.038879718495972646
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6432748538011696,
"acc_stderr": 0.03674013002860954,
"acc_norm": 0.6432748538011696,
"acc_norm_stderr": 0.03674013002860954
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2802937576499388,
"mc1_stderr": 0.015723139524608767,
"mc2": 0.43810210168491254,
"mc2_stderr": 0.01497369498317419
},
"harness|winogrande|5": {
"acc": 0.7103393843725335,
"acc_stderr": 0.012748550807638256
},
"harness|gsm8k|5": {
"acc": 0.2577710386656558,
"acc_stderr": 0.012048370213576593
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Technoculture__MT7Bi-alpha-dpo | [
"region:us"
] | 2024-02-02T21:22:21+00:00 | {"pretty_name": "Evaluation run of Technoculture/MT7Bi-alpha-dpo", "dataset_summary": "Dataset automatically created during the evaluation run of model [Technoculture/MT7Bi-alpha-dpo](https://huggingface.co/Technoculture/MT7Bi-alpha-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Technoculture__MT7Bi-alpha-dpo\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T21:20:32.408861](https://huggingface.co/datasets/open-llm-leaderboard/details_Technoculture__MT7Bi-alpha-dpo/blob/main/results_2024-02-02T21-20-32.408861.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5253447014480443,\n \"acc_stderr\": 0.034195124118131595,\n \"acc_norm\": 0.530565322003796,\n \"acc_norm_stderr\": 0.034921847920628496,\n \"mc1\": 0.2802937576499388,\n \"mc1_stderr\": 0.015723139524608767,\n \"mc2\": 0.43810210168491254,\n \"mc2_stderr\": 0.01497369498317419\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5085324232081911,\n \"acc_stderr\": 0.014609263165632186,\n \"acc_norm\": 0.5503412969283277,\n \"acc_norm_stderr\": 0.014537144444284738\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.570902210714997,\n \"acc_stderr\": 0.00493935814556132,\n \"acc_norm\": 0.7545309699263095,\n \"acc_norm_stderr\": 0.004294853999177863\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.4605263157894737,\n \"acc_stderr\": 0.04056242252249034,\n \"acc_norm\": 0.4605263157894737,\n \"acc_norm_stderr\": 0.04056242252249034\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6377358490566037,\n \"acc_stderr\": 0.029582245128384303,\n \"acc_norm\": 0.6377358490566037,\n \"acc_norm_stderr\": 0.029582245128384303\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n \"acc_stderr\": 0.04112490974670788,\n \"acc_norm\": 0.5902777777777778,\n \"acc_norm_stderr\": 0.04112490974670788\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4913294797687861,\n \"acc_stderr\": 0.03811890988940412,\n \"acc_norm\": 0.4913294797687861,\n \"acc_norm_stderr\": 0.03811890988940412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077615,\n \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077615\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.44680851063829785,\n \"acc_stderr\": 0.032500536843658404,\n \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.032500536843658404\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n \"acc_stderr\": 0.04537815354939392,\n \"acc_norm\": 0.3684210526315789,\n \"acc_norm_stderr\": 0.04537815354939392\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.31746031746031744,\n \"acc_stderr\": 0.02397386199899208,\n \"acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.02397386199899208\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.0404061017820884,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.0404061017820884\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5774193548387097,\n \"acc_stderr\": 0.02810096472427264,\n \"acc_norm\": 0.5774193548387097,\n \"acc_norm_stderr\": 0.02810096472427264\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.034991131376767445,\n \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.034991131376767445\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.03501438706296781,\n \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.03501438706296781\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6616161616161617,\n \"acc_stderr\": 0.033711241426263014,\n \"acc_norm\": 0.6616161616161617,\n \"acc_norm_stderr\": 0.033711241426263014\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7202072538860104,\n \"acc_stderr\": 0.032396370467357036,\n \"acc_norm\": 0.7202072538860104,\n 
\"acc_norm_stderr\": 0.032396370467357036\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.4717948717948718,\n \"acc_stderr\": 0.025310639254933886,\n \"acc_norm\": 0.4717948717948718,\n \"acc_norm_stderr\": 0.025310639254933886\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5042016806722689,\n \"acc_stderr\": 0.03247734334448111,\n \"acc_norm\": 0.5042016806722689,\n \"acc_norm_stderr\": 0.03247734334448111\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.304635761589404,\n \"acc_stderr\": 0.03757949922943342,\n \"acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943342\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7229357798165138,\n \"acc_stderr\": 0.01918848259016953,\n \"acc_norm\": 0.7229357798165138,\n \"acc_norm_stderr\": 0.01918848259016953\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4166666666666667,\n \"acc_stderr\": 0.03362277436608044,\n \"acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.03362277436608044\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03308611113236436,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03308611113236436\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.600896860986547,\n \"acc_stderr\": 0.03286745312567961,\n \"acc_norm\": 0.600896860986547,\n \"acc_norm_stderr\": 0.03286745312567961\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009225,\n \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009225\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6611570247933884,\n \"acc_stderr\": 0.0432076780753667,\n \"acc_norm\": 0.6611570247933884,\n \"acc_norm_stderr\": 0.0432076780753667\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.04643454608906275,\n \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.04643454608906275\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6380368098159509,\n \"acc_stderr\": 0.037757007291414416,\n \"acc_norm\": 0.6380368098159509,\n \"acc_norm_stderr\": 0.037757007291414416\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.044532548363264673,\n \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.044532548363264673\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7692307692307693,\n \"acc_stderr\": 0.027601921381417618,\n \"acc_norm\": 0.7692307692307693,\n \"acc_norm_stderr\": 0.027601921381417618\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.722860791826309,\n \"acc_stderr\": 0.01600563629412242,\n \"acc_norm\": 0.722860791826309,\n \"acc_norm_stderr\": 0.01600563629412242\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5953757225433526,\n \"acc_stderr\": 0.026424816594009845,\n \"acc_norm\": 0.5953757225433526,\n \"acc_norm_stderr\": 0.026424816594009845\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2446927374301676,\n \"acc_stderr\": 0.014378169884098447,\n \"acc_norm\": 0.2446927374301676,\n \"acc_norm_stderr\": 0.014378169884098447\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6045751633986928,\n \"acc_stderr\": 0.02799672318063144,\n \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.02799672318063144\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5562700964630225,\n \"acc_stderr\": 0.02821768355665231,\n \"acc_norm\": 0.5562700964630225,\n \"acc_norm_stderr\": 0.02821768355665231\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.027513747284379424,\n \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.027513747284379424\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.36879432624113473,\n \"acc_stderr\": 0.028782227561347237,\n \"acc_norm\": 0.36879432624113473,\n \"acc_norm_stderr\": 0.028782227561347237\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3813559322033898,\n \"acc_stderr\": 0.012405509401888124,\n \"acc_norm\": 0.3813559322033898,\n \"acc_norm_stderr\": 0.012405509401888124\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6102941176470589,\n \"acc_stderr\": 0.029624663581159703,\n \"acc_norm\": 0.6102941176470589,\n \"acc_norm_stderr\": 0.029624663581159703\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5261437908496732,\n \"acc_stderr\": 0.020200164564804588,\n \"acc_norm\": 0.5261437908496732,\n \"acc_norm_stderr\": 0.020200164564804588\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n \"acc_stderr\": 0.046737523336702384,\n \"acc_norm\": 0.6090909090909091,\n \"acc_norm_stderr\": 0.046737523336702384\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6448979591836734,\n \"acc_stderr\": 0.030635655150387638,\n \"acc_norm\": 0.6448979591836734,\n \"acc_norm_stderr\": 0.030635655150387638\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.582089552238806,\n \"acc_stderr\": 0.034875586404620636,\n \"acc_norm\": 0.582089552238806,\n \"acc_norm_stderr\": 0.034875586404620636\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n \"acc_stderr\": 0.038879718495972646,\n \"acc_norm\": 0.4759036144578313,\n \"acc_norm_stderr\": 0.038879718495972646\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6432748538011696,\n \"acc_stderr\": 0.03674013002860954,\n \"acc_norm\": 0.6432748538011696,\n \"acc_norm_stderr\": 0.03674013002860954\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2802937576499388,\n \"mc1_stderr\": 0.015723139524608767,\n \"mc2\": 0.43810210168491254,\n \"mc2_stderr\": 0.01497369498317419\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7103393843725335,\n \"acc_stderr\": 0.012748550807638256\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.2577710386656558,\n \"acc_stderr\": 0.012048370213576593\n }\n}\n```", "repo_url": "https://huggingface.co/Technoculture/MT7Bi-alpha-dpo", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|arc:challenge|25_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|gsm8k|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hellaswag|10_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T21-20-32.408861.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T21-20-32.408861.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T21-20-32.408861.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T21-20-32.408861.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T21-20-32.408861.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["**/details_harness|winogrande|5_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-02T21-20-32.408861.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_02T21_20_32.408861", "path": ["results_2024-02-02T21-20-32.408861.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T21-20-32.408861.parquet"]}]}]} | 2024-02-02T21:22:45+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Technoculture/MT7Bi-alpha-dpo
Dataset automatically created during the evaluation run of model Technoculture/MT7Bi-alpha-dpo on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
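```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Technoculture__MT7Bi-alpha-dpo",
	"harness_winogrande_5",
	split="train")
```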
## Latest results
These are the latest results from run 2024-02-02T21:20:32.408861 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Technoculture/MT7Bi-alpha-dpo\n\n\n\nDataset automatically created during the evaluation run of model Technoculture/MT7Bi-alpha-dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T21:20:32.408861(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Technoculture/MT7Bi-alpha-dpo\n\n\n\nDataset automatically created during the evaluation run of model Technoculture/MT7Bi-alpha-dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T21:20:32.408861(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
9833dfab9501c46951d3068a49c0e080e656cfe0 |
# Dataset Card for Evaluation run of Danielbrdz/Barcenas-Orca-2-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Danielbrdz/Barcenas-Orca-2-7b](https://huggingface.co/Danielbrdz/Barcenas-Orca-2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Danielbrdz__Barcenas-Orca-2-7b",
"harness_winogrande_5",
split="train")
```
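Beyond a single task, the aggregated scores can be loaded the same way. Assuming this card follows the same configuration layout as the other evaluation-run cards (a `results` configuration whose `latest` split aliases the most recent run), a minimal sketch:

```python
from datasets import load_dataset

# The "results" configuration aggregates every task's metrics for this model;
# the "latest" split points at the most recent run's results file.
results = load_dataset("open-llm-leaderboard/details_Danielbrdz__Barcenas-Orca-2-7b",
	"results",
	split="latest")
```

Replacing `"latest"` with a timestamped split name (presumably `2024_02_02T21_43_45.109844` here, following the timestamp-to-split naming used in these cards) pins the load to one specific run.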
## Latest results
These are the [latest results from run 2024-02-02T21:43:45.109844](https://huggingface.co/datasets/open-llm-leaderboard/details_Danielbrdz__Barcenas-Orca-2-7b/blob/main/results_2024-02-02T21-43-45.109844.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5591921640503429,
"acc_stderr": 0.03368722175431038,
"acc_norm": 0.5636482453113677,
"acc_norm_stderr": 0.0343906498680081,
"mc1": 0.3011015911872705,
"mc1_stderr": 0.01605899902610061,
"mc2": 0.43723878604016764,
"mc2_stderr": 0.01447045011168886
},
"harness|arc:challenge|25": {
"acc": 0.5136518771331058,
"acc_stderr": 0.014605943429860947,
"acc_norm": 0.5520477815699659,
"acc_norm_stderr": 0.014532011498211676
},
"harness|hellaswag|10": {
"acc": 0.5764787890858395,
"acc_stderr": 0.004931065434173683,
"acc_norm": 0.770762796255726,
"acc_norm_stderr": 0.004194830716126065
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.042849586397534,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.042849586397534
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926604,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6,
"acc_stderr": 0.03015113445777629,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03015113445777629
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.625,
"acc_stderr": 0.04048439222695598,
"acc_norm": 0.625,
"acc_norm_stderr": 0.04048439222695598
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.44,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.48554913294797686,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.48554913294797686,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201943,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201943
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46382978723404256,
"acc_stderr": 0.032600385118357715,
"acc_norm": 0.46382978723404256,
"acc_norm_stderr": 0.032600385118357715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537314,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537314
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.335978835978836,
"acc_stderr": 0.024326310529149135,
"acc_norm": 0.335978835978836,
"acc_norm_stderr": 0.024326310529149135
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6225806451612903,
"acc_stderr": 0.027575960723278246,
"acc_norm": 0.6225806451612903,
"acc_norm_stderr": 0.027575960723278246
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.37438423645320196,
"acc_stderr": 0.03405155380561952,
"acc_norm": 0.37438423645320196,
"acc_norm_stderr": 0.03405155380561952
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6919191919191919,
"acc_stderr": 0.03289477330098616,
"acc_norm": 0.6919191919191919,
"acc_norm_stderr": 0.03289477330098616
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7979274611398963,
"acc_stderr": 0.02897908979429673,
"acc_norm": 0.7979274611398963,
"acc_norm_stderr": 0.02897908979429673
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5256410256410257,
"acc_stderr": 0.025317649726448666,
"acc_norm": 0.5256410256410257,
"acc_norm_stderr": 0.025317649726448666
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.02773896963217609,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.02773896963217609
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.0322529423239964,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.0322529423239964
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658754,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658754
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7467889908256881,
"acc_stderr": 0.01864407304137504,
"acc_norm": 0.7467889908256881,
"acc_norm_stderr": 0.01864407304137504
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.029331162294251745,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.029331162294251745
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6322869955156951,
"acc_stderr": 0.03236198350928275,
"acc_norm": 0.6322869955156951,
"acc_norm_stderr": 0.03236198350928275
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7099236641221374,
"acc_stderr": 0.03980066246467766,
"acc_norm": 0.7099236641221374,
"acc_norm_stderr": 0.03980066246467766
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6776859504132231,
"acc_stderr": 0.04266416363352168,
"acc_norm": 0.6776859504132231,
"acc_norm_stderr": 0.04266416363352168
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.045245960070300476,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.045245960070300476
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6073619631901841,
"acc_stderr": 0.03836740907831028,
"acc_norm": 0.6073619631901841,
"acc_norm_stderr": 0.03836740907831028
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973647,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973647
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8290598290598291,
"acc_stderr": 0.02466249684520982,
"acc_norm": 0.8290598290598291,
"acc_norm_stderr": 0.02466249684520982
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7394636015325671,
"acc_stderr": 0.015696008563807082,
"acc_norm": 0.7394636015325671,
"acc_norm_stderr": 0.015696008563807082
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.02636243757454654,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.02636243757454654
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24916201117318434,
"acc_stderr": 0.014465893829859926,
"acc_norm": 0.24916201117318434,
"acc_norm_stderr": 0.014465893829859926
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6241830065359477,
"acc_stderr": 0.027732834353363937,
"acc_norm": 0.6241830065359477,
"acc_norm_stderr": 0.027732834353363937
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6463022508038585,
"acc_stderr": 0.02715520810320087,
"acc_norm": 0.6463022508038585,
"acc_norm_stderr": 0.02715520810320087
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6512345679012346,
"acc_stderr": 0.026517597724465013,
"acc_norm": 0.6512345679012346,
"acc_norm_stderr": 0.026517597724465013
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.38652482269503546,
"acc_stderr": 0.02904919034254346,
"acc_norm": 0.38652482269503546,
"acc_norm_stderr": 0.02904919034254346
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.38396349413298564,
"acc_stderr": 0.012421587833134231,
"acc_norm": 0.38396349413298564,
"acc_norm_stderr": 0.012421587833134231
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5183823529411765,
"acc_stderr": 0.030352303395351964,
"acc_norm": 0.5183823529411765,
"acc_norm_stderr": 0.030352303395351964
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5408496732026143,
"acc_stderr": 0.020160213617222516,
"acc_norm": 0.5408496732026143,
"acc_norm_stderr": 0.020160213617222516
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5545454545454546,
"acc_stderr": 0.047605488214603246,
"acc_norm": 0.5545454545454546,
"acc_norm_stderr": 0.047605488214603246
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6857142857142857,
"acc_stderr": 0.029719329422417475,
"acc_norm": 0.6857142857142857,
"acc_norm_stderr": 0.029719329422417475
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.736318407960199,
"acc_stderr": 0.03115715086935557,
"acc_norm": 0.736318407960199,
"acc_norm_stderr": 0.03115715086935557
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.03218093795602357,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.03218093795602357
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3011015911872705,
"mc1_stderr": 0.01605899902610061,
"mc2": 0.43723878604016764,
"mc2_stderr": 0.01447045011168886
},
"harness|winogrande|5": {
"acc": 0.755327545382794,
"acc_stderr": 0.012082125654159738
},
"harness|gsm8k|5": {
"acc": 0.332827899924185,
"acc_stderr": 0.01297989249659828
}
}
```
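A minimal sketch, assuming a local copy of the JSON block above saved as `results.json` (the filename is hypothetical), of how the per-task numbers roll up into a macro-averaged MMLU score:

```python
import json

# hypothetical local copy of the results JSON shown above
with open("results.json") as f:
    results = json.load(f)

# macro-average the per-task MMLU (hendrycksTest) accuracies
mmlu = {task: vals["acc"] for task, vals in results.items()
        if task.startswith("harness|hendrycksTest")}
print(sum(mmlu.values()) / len(mmlu))
```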
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Danielbrdz__Barcenas-Orca-2-7b | [
"region:us"
] | 2024-02-02T21:46:07+00:00 | {"pretty_name": "Evaluation run of Danielbrdz/Barcenas-Orca-2-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [Danielbrdz/Barcenas-Orca-2-7b](https://huggingface.co/Danielbrdz/Barcenas-Orca-2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Danielbrdz__Barcenas-Orca-2-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T21:43:45.109844](https://huggingface.co/datasets/open-llm-leaderboard/details_Danielbrdz__Barcenas-Orca-2-7b/blob/main/results_2024-02-02T21-43-45.109844.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5591921640503429,\n \"acc_stderr\": 0.03368722175431038,\n \"acc_norm\": 0.5636482453113677,\n \"acc_norm_stderr\": 0.0343906498680081,\n \"mc1\": 0.3011015911872705,\n \"mc1_stderr\": 0.01605899902610061,\n \"mc2\": 0.43723878604016764,\n \"mc2_stderr\": 0.01447045011168886\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5136518771331058,\n \"acc_stderr\": 0.014605943429860947,\n \"acc_norm\": 0.5520477815699659,\n \"acc_norm_stderr\": 0.014532011498211676\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5764787890858395,\n \"acc_stderr\": 0.004931065434173683,\n \"acc_norm\": 0.770762796255726,\n \"acc_norm_stderr\": 0.004194830716126065\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n \"acc_stderr\": 0.042849586397534,\n \"acc_norm\": 0.562962962962963,\n \"acc_norm_stderr\": 0.042849586397534\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926604,\n \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926604\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03015113445777629,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03015113445777629\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.625,\n \"acc_stderr\": 0.04048439222695598,\n \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.04048439222695598\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.049888765156985884,\n 
\"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.48554913294797686,\n \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.48554913294797686,\n \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201943,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201943\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.46382978723404256,\n \"acc_stderr\": 0.032600385118357715,\n \"acc_norm\": 0.46382978723404256,\n \"acc_norm_stderr\": 0.032600385118357715\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.04303684033537314,\n \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.04303684033537314\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.335978835978836,\n \"acc_stderr\": 0.024326310529149135,\n \"acc_norm\": 0.335978835978836,\n \"acc_norm_stderr\": 0.024326310529149135\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6225806451612903,\n \"acc_stderr\": 0.027575960723278246,\n \"acc_norm\": 0.6225806451612903,\n \"acc_norm_stderr\": 0.027575960723278246\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.37438423645320196,\n \"acc_stderr\": 0.03405155380561952,\n \"acc_norm\": 0.37438423645320196,\n \"acc_norm_stderr\": 0.03405155380561952\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6919191919191919,\n \"acc_stderr\": 0.03289477330098616,\n \"acc_norm\": 0.6919191919191919,\n \"acc_norm_stderr\": 0.03289477330098616\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.02897908979429673,\n \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.02897908979429673\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.5256410256410257,\n \"acc_stderr\": 0.025317649726448666,\n \"acc_norm\": 0.5256410256410257,\n \"acc_norm_stderr\": 0.025317649726448666\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.29259259259259257,\n \"acc_stderr\": 0.02773896963217609,\n \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.02773896963217609\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.0322529423239964,\n \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.0322529423239964\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658754,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658754\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7467889908256881,\n \"acc_stderr\": 0.01864407304137504,\n \"acc_norm\": 0.7467889908256881,\n \"acc_norm_stderr\": 0.01864407304137504\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.03407632093854051,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.03407632093854051\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.029331162294251745,\n \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.029331162294251745\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n \"acc_stderr\": 0.03236198350928275,\n \"acc_norm\": 0.6322869955156951,\n \"acc_norm_stderr\": 0.03236198350928275\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467766,\n \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467766\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6776859504132231,\n \"acc_stderr\": 0.04266416363352168,\n \"acc_norm\": 0.6776859504132231,\n \"acc_norm_stderr\": 0.04266416363352168\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.045245960070300476,\n \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.045245960070300476\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6073619631901841,\n \"acc_stderr\": 0.03836740907831028,\n \"acc_norm\": 0.6073619631901841,\n \"acc_norm_stderr\": 0.03836740907831028\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n \"acc_stderr\": 0.04653333146973647,\n \"acc_norm\": 0.4017857142857143,\n \"acc_norm_stderr\": 0.04653333146973647\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8290598290598291,\n \"acc_stderr\": 0.02466249684520982,\n \"acc_norm\": 0.8290598290598291,\n \"acc_norm_stderr\": 0.02466249684520982\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7394636015325671,\n \"acc_stderr\": 0.015696008563807082,\n \"acc_norm\": 
0.7394636015325671,\n \"acc_norm_stderr\": 0.015696008563807082\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6011560693641619,\n \"acc_stderr\": 0.02636243757454654,\n \"acc_norm\": 0.6011560693641619,\n \"acc_norm_stderr\": 0.02636243757454654\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24916201117318434,\n \"acc_stderr\": 0.014465893829859926,\n \"acc_norm\": 0.24916201117318434,\n \"acc_norm_stderr\": 0.014465893829859926\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6241830065359477,\n \"acc_stderr\": 0.027732834353363937,\n \"acc_norm\": 0.6241830065359477,\n \"acc_norm_stderr\": 0.027732834353363937\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6463022508038585,\n \"acc_stderr\": 0.02715520810320087,\n \"acc_norm\": 0.6463022508038585,\n \"acc_norm_stderr\": 0.02715520810320087\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6512345679012346,\n \"acc_stderr\": 0.026517597724465013,\n \"acc_norm\": 0.6512345679012346,\n \"acc_norm_stderr\": 0.026517597724465013\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.38652482269503546,\n \"acc_stderr\": 0.02904919034254346,\n \"acc_norm\": 0.38652482269503546,\n \"acc_norm_stderr\": 0.02904919034254346\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.38396349413298564,\n \"acc_stderr\": 0.012421587833134231,\n \"acc_norm\": 0.38396349413298564,\n \"acc_norm_stderr\": 0.012421587833134231\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5183823529411765,\n \"acc_stderr\": 0.030352303395351964,\n \"acc_norm\": 0.5183823529411765,\n \"acc_norm_stderr\": 0.030352303395351964\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5408496732026143,\n \"acc_stderr\": 0.020160213617222516,\n \"acc_norm\": 0.5408496732026143,\n \"acc_norm_stderr\": 0.020160213617222516\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5545454545454546,\n \"acc_stderr\": 0.047605488214603246,\n \"acc_norm\": 0.5545454545454546,\n \"acc_norm_stderr\": 0.047605488214603246\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6857142857142857,\n \"acc_stderr\": 0.029719329422417475,\n \"acc_norm\": 0.6857142857142857,\n \"acc_norm_stderr\": 0.029719329422417475\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.736318407960199,\n \"acc_stderr\": 0.03115715086935557,\n \"acc_norm\": 0.736318407960199,\n \"acc_norm_stderr\": 0.03115715086935557\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.4578313253012048,\n \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.03218093795602357,\n \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.03218093795602357\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3011015911872705,\n \"mc1_stderr\": 0.01605899902610061,\n \"mc2\": 0.43723878604016764,\n \"mc2_stderr\": 0.01447045011168886\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.755327545382794,\n \"acc_stderr\": 0.012082125654159738\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.332827899924185,\n \"acc_stderr\": 0.01297989249659828\n }\n}\n```", "repo_url": 
"https://huggingface.co/Danielbrdz/Barcenas-Orca-2-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|arc:challenge|25_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|gsm8k|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hellaswag|10_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T21-43-45.109844.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T21-43-45.109844.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T21-43-45.109844.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T21-43-45.109844.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T21-43-45.109844.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T21_43_45.109844", "path": ["**/details_harness|winogrande|5_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T21-43-45.109844.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T21_43_45.109844", "path": ["results_2024-02-02T21-43-45.109844.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T21-43-45.109844.parquet"]}]}]} | 2024-02-02T21:46:35+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Danielbrdz/Barcenas-Orca-2-7b
Dataset automatically created during the evaluation run of model Danielbrdz/Barcenas-Orca-2-7b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
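```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Danielbrdz__Barcenas-Orca-2-7b",
	"harness_winogrande_5",
	split="train")
```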
## Latest results
These are the latest results from run 2024-02-02T21:43:45.109844 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
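A minimal sketch for pulling these aggregated metrics programmatically, using the "results" configuration described above ("latest" always resolves to the most recent run):

```python
from datasets import load_dataset

# the "results" config stores the aggregated metrics for each run
results = load_dataset("open-llm-leaderboard/details_Danielbrdz__Barcenas-Orca-2-7b",
	"results",
	split="latest")
```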
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Danielbrdz/Barcenas-Orca-2-7b\n\n\n\nDataset automatically created during the evaluation run of model Danielbrdz/Barcenas-Orca-2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T21:43:45.109844(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Danielbrdz/Barcenas-Orca-2-7b\n\n\n\nDataset automatically created during the evaluation run of model Danielbrdz/Barcenas-Orca-2-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T21:43:45.109844(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
42b61773492be9267a815bfa2363d7d996ba7196 | # Dataset Card for "function-calling-chatml"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Locutusque/function-calling-chatml | [
"region:us"
] | 2024-02-02T21:51:35+00:00 | {"dataset_info": {"features": [{"name": "system_message", "dtype": "string"}, {"name": "function_description", "dtype": "string"}, {"name": "conversations", "list": [{"name": "from", "dtype": "string"}, {"name": "value", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 311913135, "num_examples": 112960}], "download_size": 107035875, "dataset_size": 311913135}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-12T20:04:02+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "function-calling-chatml"
More Information needed | [
"# Dataset Card for \"function-calling-chatml\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"function-calling-chatml\"\n\nMore Information needed"
] |
7ffb63f8a686a54968b5f0ef8cdf704c4afbebfc |
# Dataset Card for Evaluation run of ChuckMcSneed/PMaxxxer-v1-70b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ChuckMcSneed/PMaxxxer-v1-70b](https://huggingface.co/ChuckMcSneed/PMaxxxer-v1-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ChuckMcSneed__PMaxxxer-v1-70b",
"harness_winogrande_5",
split="train")
```
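If you prefer to discover the available configurations programmatically instead of hard-coding a name such as "harness_winogrande_5", the `datasets` library exposes a helper for that. The snippet below is a minimal sketch; the returned names should match the per-task configurations listed in this repository's metadata:

```python
from datasets import get_dataset_config_names

# One configuration per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names("open-llm-leaderboard/details_ChuckMcSneed__PMaxxxer-v1-70b")
print(len(configs), "configurations, e.g.:", configs[:5])
```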
## Latest results
These are the [latest results from run 2024-02-02T22:03:56.131978](https://huggingface.co/datasets/open-llm-leaderboard/details_ChuckMcSneed__PMaxxxer-v1-70b/blob/main/results_2024-02-02T22-03-56.131978.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7035279982520029,
"acc_stderr": 0.03021109225738741,
"acc_norm": 0.7069994947069987,
"acc_norm_stderr": 0.03079561935758003,
"mc1": 0.41370869033047736,
"mc1_stderr": 0.017240861812099804,
"mc2": 0.5977344695316497,
"mc2_stderr": 0.014678922726011788
},
"harness|arc:challenge|25": {
"acc": 0.6646757679180887,
"acc_stderr": 0.01379618294778556,
"acc_norm": 0.7107508532423208,
"acc_norm_stderr": 0.013250012579393443
},
"harness|hellaswag|10": {
"acc": 0.673770165305716,
"acc_stderr": 0.004678743563766662,
"acc_norm": 0.8788090021907986,
"acc_norm_stderr": 0.003256821418857315
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742399,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8092105263157895,
"acc_stderr": 0.031975658210325,
"acc_norm": 0.8092105263157895,
"acc_norm_stderr": 0.031975658210325
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7283018867924528,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.7283018867924528,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8402777777777778,
"acc_stderr": 0.030635578972093274,
"acc_norm": 0.8402777777777778,
"acc_norm_stderr": 0.030635578972093274
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6723404255319149,
"acc_stderr": 0.03068302084323101,
"acc_norm": 0.6723404255319149,
"acc_norm_stderr": 0.03068302084323101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594962,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594962
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6068965517241379,
"acc_stderr": 0.040703290137070705,
"acc_norm": 0.6068965517241379,
"acc_norm_stderr": 0.040703290137070705
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.02572209706438853,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.02572209706438853
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8129032258064516,
"acc_stderr": 0.02218571009225225,
"acc_norm": 0.8129032258064516,
"acc_norm_stderr": 0.02218571009225225
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.03465304488406795,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.03465304488406795
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.78,
"acc_stderr": 0.041633319989322626,
"acc_norm": 0.78,
"acc_norm_stderr": 0.041633319989322626
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8303030303030303,
"acc_stderr": 0.029311188674983134,
"acc_norm": 0.8303030303030303,
"acc_norm_stderr": 0.029311188674983134
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.02239078763821677,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.02239078763821677
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9326424870466321,
"acc_stderr": 0.018088393839078894,
"acc_norm": 0.9326424870466321,
"acc_norm_stderr": 0.018088393839078894
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7076923076923077,
"acc_stderr": 0.023060438380857737,
"acc_norm": 0.7076923076923077,
"acc_norm_stderr": 0.023060438380857737
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.02773896963217609,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.02773896963217609
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.02755361446786381,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.02755361446786381
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.46357615894039733,
"acc_stderr": 0.04071636065944215,
"acc_norm": 0.46357615894039733,
"acc_norm_stderr": 0.04071636065944215
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8899082568807339,
"acc_stderr": 0.013419939018681203,
"acc_norm": 0.8899082568807339,
"acc_norm_stderr": 0.013419939018681203
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.033723432716530624,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.033723432716530624
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9166666666666666,
"acc_stderr": 0.019398452135813905,
"acc_norm": 0.9166666666666666,
"acc_norm_stderr": 0.019398452135813905
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8945147679324894,
"acc_stderr": 0.019995560723758545,
"acc_norm": 0.8945147679324894,
"acc_norm_stderr": 0.019995560723758545
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8071748878923767,
"acc_stderr": 0.026478240960489365,
"acc_norm": 0.8071748878923767,
"acc_norm_stderr": 0.026478240960489365
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8702290076335878,
"acc_stderr": 0.029473649496907065,
"acc_norm": 0.8702290076335878,
"acc_norm_stderr": 0.029473649496907065
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.859504132231405,
"acc_stderr": 0.03172233426002158,
"acc_norm": 0.859504132231405,
"acc_norm_stderr": 0.03172233426002158
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8343558282208589,
"acc_stderr": 0.029208296231259104,
"acc_norm": 0.8343558282208589,
"acc_norm_stderr": 0.029208296231259104
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.037601780060266196,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.037601780060266196
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8974358974358975,
"acc_stderr": 0.019875655027867464,
"acc_norm": 0.8974358974358975,
"acc_norm_stderr": 0.019875655027867464
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8646232439335888,
"acc_stderr": 0.012234384586856491,
"acc_norm": 0.8646232439335888,
"acc_norm_stderr": 0.012234384586856491
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7976878612716763,
"acc_stderr": 0.02162807738019612,
"acc_norm": 0.7976878612716763,
"acc_norm_stderr": 0.02162807738019612
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5854748603351956,
"acc_stderr": 0.016476342210254003,
"acc_norm": 0.5854748603351956,
"acc_norm_stderr": 0.016476342210254003
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.024288619466046112,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.024288619466046112
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7717041800643086,
"acc_stderr": 0.0238393033113982,
"acc_norm": 0.7717041800643086,
"acc_norm_stderr": 0.0238393033113982
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.02073635840806,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.02073635840806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5921985815602837,
"acc_stderr": 0.02931601177634356,
"acc_norm": 0.5921985815602837,
"acc_norm_stderr": 0.02931601177634356
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5619295958279009,
"acc_stderr": 0.012671902782567636,
"acc_norm": 0.5619295958279009,
"acc_norm_stderr": 0.012671902782567636
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7316176470588235,
"acc_stderr": 0.0269174812243772,
"acc_norm": 0.7316176470588235,
"acc_norm_stderr": 0.0269174812243772
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.01728276069516741,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.01728276069516741
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.04069306319721374,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.04069306319721374
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8163265306122449,
"acc_stderr": 0.024789071332007636,
"acc_norm": 0.8163265306122449,
"acc_norm_stderr": 0.024789071332007636
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.900497512437811,
"acc_stderr": 0.0211662163046594,
"acc_norm": 0.900497512437811,
"acc_norm_stderr": 0.0211662163046594
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8830409356725146,
"acc_stderr": 0.024648068961366176,
"acc_norm": 0.8830409356725146,
"acc_norm_stderr": 0.024648068961366176
},
"harness|truthfulqa:mc|0": {
"mc1": 0.41370869033047736,
"mc1_stderr": 0.017240861812099804,
"mc2": 0.5977344695316497,
"mc2_stderr": 0.014678922726011788
},
"harness|winogrande|5": {
"acc": 0.8263614838200474,
"acc_stderr": 0.010646116480331
},
"harness|gsm8k|5": {
"acc": 0.6269901440485216,
"acc_stderr": 0.013320876609777222
}
}
```
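As an unofficial illustration of how these numbers can be post-processed, the sketch below averages the per-subject MMLU scores and combines the headline metrics into a single leaderboard-style mean. It assumes the flat dict shape shown above has been saved to a local JSON file (the filename here is hypothetical):

```python
import json

# Hypothetical local copy of the results JSON shown above.
with open("results_2024-02-02T22-03-56.131978.json") as f:
    results = json.load(f)

# MMLU is reported per subject under "harness|hendrycksTest-*" keys.
mmlu = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest")]
mmlu_mean = sum(mmlu) / len(mmlu)

# Leaderboard-style average over the six headline metrics.
headline = [
    results["harness|arc:challenge|25"]["acc_norm"],
    results["harness|hellaswag|10"]["acc_norm"],
    mmlu_mean,
    results["harness|truthfulqa:mc|0"]["mc2"],
    results["harness|winogrande|5"]["acc"],
    results["harness|gsm8k|5"]["acc"],
]
print(f"MMLU mean over {len(mmlu)} subjects: {mmlu_mean:.4f}")
print(f"headline average: {sum(headline) / len(headline):.4f}")
```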
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_ChuckMcSneed__PMaxxxer-v1-70b | [
"region:us"
] | 2024-02-02T22:06:23+00:00 | {"pretty_name": "Evaluation run of ChuckMcSneed/PMaxxxer-v1-70b", "dataset_summary": "Dataset automatically created during the evaluation run of model [ChuckMcSneed/PMaxxxer-v1-70b](https://huggingface.co/ChuckMcSneed/PMaxxxer-v1-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ChuckMcSneed__PMaxxxer-v1-70b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T22:03:56.131978](https://huggingface.co/datasets/open-llm-leaderboard/details_ChuckMcSneed__PMaxxxer-v1-70b/blob/main/results_2024-02-02T22-03-56.131978.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7035279982520029,\n \"acc_stderr\": 0.03021109225738741,\n \"acc_norm\": 0.7069994947069987,\n \"acc_norm_stderr\": 0.03079561935758003,\n \"mc1\": 0.41370869033047736,\n \"mc1_stderr\": 0.017240861812099804,\n \"mc2\": 0.5977344695316497,\n \"mc2_stderr\": 0.014678922726011788\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6646757679180887,\n \"acc_stderr\": 0.01379618294778556,\n \"acc_norm\": 0.7107508532423208,\n \"acc_norm_stderr\": 0.013250012579393443\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.673770165305716,\n \"acc_stderr\": 0.004678743563766662,\n \"acc_norm\": 0.8788090021907986,\n \"acc_norm_stderr\": 0.003256821418857315\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939098,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939098\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742399,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742399\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8092105263157895,\n \"acc_stderr\": 0.031975658210325,\n \"acc_norm\": 0.8092105263157895,\n \"acc_norm_stderr\": 0.031975658210325\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7283018867924528,\n \"acc_stderr\": 0.027377706624670713,\n \"acc_norm\": 0.7283018867924528,\n \"acc_norm_stderr\": 0.027377706624670713\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8402777777777778,\n \"acc_stderr\": 0.030635578972093274,\n \"acc_norm\": 0.8402777777777778,\n \"acc_norm_stderr\": 0.030635578972093274\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6723404255319149,\n \"acc_stderr\": 0.03068302084323101,\n \"acc_norm\": 0.6723404255319149,\n \"acc_norm_stderr\": 0.03068302084323101\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n \"acc_stderr\": 0.04657047260594962,\n \"acc_norm\": 0.4298245614035088,\n \"acc_norm_stderr\": 0.04657047260594962\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.040703290137070705,\n \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.040703290137070705\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.02572209706438853,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.02572209706438853\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8129032258064516,\n \"acc_stderr\": 0.02218571009225225,\n \"acc_norm\": 0.8129032258064516,\n \"acc_norm_stderr\": 0.02218571009225225\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.03465304488406795,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.03465304488406795\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.041633319989322626,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.041633319989322626\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8303030303030303,\n \"acc_stderr\": 0.029311188674983134,\n \"acc_norm\": 0.8303030303030303,\n \"acc_norm_stderr\": 0.029311188674983134\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.02239078763821677,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.02239078763821677\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9326424870466321,\n \"acc_stderr\": 0.018088393839078894,\n \"acc_norm\": 0.9326424870466321,\n \"acc_norm_stderr\": 0.018088393839078894\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7076923076923077,\n \"acc_stderr\": 0.023060438380857737,\n \"acc_norm\": 0.7076923076923077,\n \"acc_norm_stderr\": 0.023060438380857737\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.29259259259259257,\n \"acc_stderr\": 0.02773896963217609,\n \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.02773896963217609\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.02755361446786381,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.02755361446786381\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.46357615894039733,\n \"acc_stderr\": 0.04071636065944215,\n \"acc_norm\": 0.46357615894039733,\n \"acc_norm_stderr\": 0.04071636065944215\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8899082568807339,\n \"acc_stderr\": 0.013419939018681203,\n \"acc_norm\": 0.8899082568807339,\n \"acc_norm_stderr\": 0.013419939018681203\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.033723432716530624,\n \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.033723432716530624\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9166666666666666,\n \"acc_stderr\": 0.019398452135813905,\n \"acc_norm\": 0.9166666666666666,\n \"acc_norm_stderr\": 0.019398452135813905\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8945147679324894,\n \"acc_stderr\": 0.019995560723758545,\n \"acc_norm\": 0.8945147679324894,\n \"acc_norm_stderr\": 0.019995560723758545\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8071748878923767,\n \"acc_stderr\": 0.026478240960489365,\n \"acc_norm\": 0.8071748878923767,\n \"acc_norm_stderr\": 0.026478240960489365\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.859504132231405,\n \"acc_stderr\": 0.03172233426002158,\n \"acc_norm\": 0.859504132231405,\n \"acc_norm_stderr\": 0.03172233426002158\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8343558282208589,\n \"acc_stderr\": 0.029208296231259104,\n \"acc_norm\": 0.8343558282208589,\n \"acc_norm_stderr\": 0.029208296231259104\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.037601780060266196,\n \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.037601780060266196\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8974358974358975,\n \"acc_stderr\": 0.019875655027867464,\n \"acc_norm\": 0.8974358974358975,\n \"acc_norm_stderr\": 0.019875655027867464\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.8646232439335888,\n \"acc_stderr\": 0.012234384586856491,\n \"acc_norm\": 0.8646232439335888,\n \"acc_norm_stderr\": 0.012234384586856491\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7976878612716763,\n \"acc_stderr\": 0.02162807738019612,\n \"acc_norm\": 0.7976878612716763,\n \"acc_norm_stderr\": 0.02162807738019612\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5854748603351956,\n \"acc_stderr\": 0.016476342210254003,\n \"acc_norm\": 0.5854748603351956,\n \"acc_norm_stderr\": 0.016476342210254003\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.024288619466046112,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.024288619466046112\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7717041800643086,\n \"acc_stderr\": 0.0238393033113982,\n \"acc_norm\": 0.7717041800643086,\n \"acc_norm_stderr\": 0.0238393033113982\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.02073635840806,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.02073635840806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5921985815602837,\n \"acc_stderr\": 0.02931601177634356,\n \"acc_norm\": 0.5921985815602837,\n \"acc_norm_stderr\": 0.02931601177634356\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5619295958279009,\n \"acc_stderr\": 0.012671902782567636,\n \"acc_norm\": 0.5619295958279009,\n \"acc_norm_stderr\": 0.012671902782567636\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7316176470588235,\n \"acc_stderr\": 0.0269174812243772,\n \"acc_norm\": 0.7316176470588235,\n \"acc_norm_stderr\": 0.0269174812243772\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7598039215686274,\n \"acc_stderr\": 0.01728276069516741,\n \"acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.01728276069516741\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.04069306319721374,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.04069306319721374\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8163265306122449,\n \"acc_stderr\": 0.024789071332007636,\n \"acc_norm\": 0.8163265306122449,\n \"acc_norm_stderr\": 0.024789071332007636\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n \"acc_stderr\": 0.0211662163046594,\n \"acc_norm\": 0.900497512437811,\n \"acc_norm_stderr\": 0.0211662163046594\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.024648068961366176,\n \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.024648068961366176\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.41370869033047736,\n \"mc1_stderr\": 0.017240861812099804,\n \"mc2\": 0.5977344695316497,\n \"mc2_stderr\": 0.014678922726011788\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8263614838200474,\n \"acc_stderr\": 0.010646116480331\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6269901440485216,\n \"acc_stderr\": 0.013320876609777222\n }\n}\n```", 
"repo_url": "https://huggingface.co/ChuckMcSneed/PMaxxxer-v1-70b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|arc:challenge|25_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|gsm8k|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hellaswag|10_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T22-03-56.131978.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T22-03-56.131978.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T22-03-56.131978.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T22-03-56.131978.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T22-03-56.131978.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T22_03_56.131978", "path": ["**/details_harness|winogrande|5_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T22-03-56.131978.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T22_03_56.131978", "path": ["results_2024-02-02T22-03-56.131978.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T22-03-56.131978.parquet"]}]}]} | 2024-02-02T22:06:49+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of ChuckMcSneed/PMaxxxer-v1-70b
Dataset automatically created during the evaluation run of model ChuckMcSneed/PMaxxxer-v1-70b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
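```python
from datasets import load_dataset

# The repo id below is reconstructed from the leaderboard's standard
# details_<org>__<model> naming convention used throughout these cards.
data = load_dataset("open-llm-leaderboard/details_ChuckMcSneed__PMaxxxer-v1-70b",
	"harness_winogrande_5",
	split="train")
```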
## Latest results
These are the latest results from run 2024-02-02T22:03:56.131978 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of ChuckMcSneed/PMaxxxer-v1-70b\n\n\n\nDataset automatically created during the evaluation run of model ChuckMcSneed/PMaxxxer-v1-70b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T22:03:56.131978(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of ChuckMcSneed/PMaxxxer-v1-70b\n\n\n\nDataset automatically created during the evaluation run of model ChuckMcSneed/PMaxxxer-v1-70b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T22:03:56.131978(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
30e70004f4c37be0f043c09cbcd2344da9542f45 |
# Dataset Card for Evaluation run of ChuckMcSneed/WinterGoddess-1.4x-70b-32k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ChuckMcSneed/WinterGoddess-1.4x-70b-32k](https://huggingface.co/ChuckMcSneed/WinterGoddess-1.4x-70b-32k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ChuckMcSneed__WinterGoddess-1.4x-70b-32k",
"harness_winogrande_5",
split="train")
```
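For the aggregated metrics, a minimal sketch (same interface, using the "results" configuration listed in this repo and its "latest" split) is:
```python
from datasets import load_dataset

# "results" holds one row of aggregated scores per run; the "latest" split
# points to the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_ChuckMcSneed__WinterGoddess-1.4x-70b-32k",
	"results",
	split="latest")
print(results[0])
```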
## Latest results
These are the [latest results from run 2024-02-02T22:05:42.684950](https://huggingface.co/datasets/open-llm-leaderboard/details_ChuckMcSneed__WinterGoddess-1.4x-70b-32k/blob/main/results_2024-02-02T22-05-42.684950.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6639765808635061,
"acc_stderr": 0.03149901214232908,
"acc_norm": 0.6688177091509891,
"acc_norm_stderr": 0.03212294826804706,
"mc1": 0.4700122399020808,
"mc1_stderr": 0.01747199209169754,
"mc2": 0.6387436130479125,
"mc2_stderr": 0.014303842525660086
},
"harness|arc:challenge|25": {
"acc": 0.6732081911262798,
"acc_stderr": 0.013706665975587338,
"acc_norm": 0.71160409556314,
"acc_norm_stderr": 0.013238394422428171
},
"harness|hellaswag|10": {
"acc": 0.7134037044413464,
"acc_stderr": 0.004512471612415591,
"acc_norm": 0.8911571400119498,
"acc_norm_stderr": 0.0031080545633521087
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7828947368421053,
"acc_stderr": 0.03355045304882924,
"acc_norm": 0.7828947368421053,
"acc_norm_stderr": 0.03355045304882924
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03309615177059006,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03309615177059006
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.047551296160629475,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.047551296160629475
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6127659574468085,
"acc_stderr": 0.03184389265339525,
"acc_norm": 0.6127659574468085,
"acc_norm_stderr": 0.03184389265339525
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949098,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949098
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7354838709677419,
"acc_stderr": 0.02509189237885928,
"acc_norm": 0.7354838709677419,
"acc_norm_stderr": 0.02509189237885928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5369458128078818,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.5369458128078818,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.027998073798781668,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.027998073798781668
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6461538461538462,
"acc_stderr": 0.024243783994062164,
"acc_norm": 0.6461538461538462,
"acc_norm_stderr": 0.024243783994062164
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02730914058823018,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02730914058823018
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887058,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887058
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.45695364238410596,
"acc_stderr": 0.04067325174247443,
"acc_norm": 0.45695364238410596,
"acc_norm_stderr": 0.04067325174247443
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8788990825688073,
"acc_stderr": 0.013987618292389713,
"acc_norm": 0.8788990825688073,
"acc_norm_stderr": 0.013987618292389713
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8872549019607843,
"acc_stderr": 0.02219857103945679,
"acc_norm": 0.8872549019607843,
"acc_norm_stderr": 0.02219857103945679
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8649789029535865,
"acc_stderr": 0.022245776632003694,
"acc_norm": 0.8649789029535865,
"acc_norm_stderr": 0.022245776632003694
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7892376681614349,
"acc_stderr": 0.027373095500540193,
"acc_norm": 0.7892376681614349,
"acc_norm_stderr": 0.027373095500540193
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159462,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159462
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8429752066115702,
"acc_stderr": 0.03321244842547128,
"acc_norm": 0.8429752066115702,
"acc_norm_stderr": 0.03321244842547128
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742179,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742179
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5892857142857143,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.5892857142857143,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8974358974358975,
"acc_stderr": 0.019875655027867443,
"acc_norm": 0.8974358974358975,
"acc_norm_stderr": 0.019875655027867443
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608303,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608303
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7601156069364162,
"acc_stderr": 0.022989592543123567,
"acc_norm": 0.7601156069364162,
"acc_norm_stderr": 0.022989592543123567
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33631284916201115,
"acc_stderr": 0.01580100372914589,
"acc_norm": 0.33631284916201115,
"acc_norm_stderr": 0.01580100372914589
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.026336613469046633,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.026336613469046633
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7459807073954984,
"acc_stderr": 0.024723861504771693,
"acc_norm": 0.7459807073954984,
"acc_norm_stderr": 0.024723861504771693
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7746913580246914,
"acc_stderr": 0.02324620264781975,
"acc_norm": 0.7746913580246914,
"acc_norm_stderr": 0.02324620264781975
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5567375886524822,
"acc_stderr": 0.02963483847376601,
"acc_norm": 0.5567375886524822,
"acc_norm_stderr": 0.02963483847376601
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5625814863102999,
"acc_stderr": 0.01266981346493572,
"acc_norm": 0.5625814863102999,
"acc_norm_stderr": 0.01266981346493572
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.02850145286039656,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.02850145286039656
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7401960784313726,
"acc_stderr": 0.017740899509177795,
"acc_norm": 0.7401960784313726,
"acc_norm_stderr": 0.017740899509177795
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252091,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252091
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7673469387755102,
"acc_stderr": 0.02704925791589618,
"acc_norm": 0.7673469387755102,
"acc_norm_stderr": 0.02704925791589618
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.027962677604768914,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.027962677604768914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036847,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036847
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4700122399020808,
"mc1_stderr": 0.01747199209169754,
"mc2": 0.6387436130479125,
"mc2_stderr": 0.014303842525660086
},
"harness|winogrande|5": {
"acc": 0.8255722178374112,
"acc_stderr": 0.010665187902498438
},
"harness|gsm8k|5": {
"acc": 0.43290371493555724,
"acc_stderr": 0.013647916362576056
}
}
```
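Individual metrics can be read straight out of this blob; a minimal sketch, assuming the JSON above has been saved locally as `results.json` (a hypothetical filename):
```python
import json

with open("results.json") as f:  # hypothetical local copy of the blob above
    res = json.load(f)

# e.g. the normalized ARC-challenge accuracy reported above
print(res["harness|arc:challenge|25"]["acc_norm"])  # 0.71160409556314
```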
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_ChuckMcSneed__WinterGoddess-1.4x-70b-32k | [
"region:us"
] | 2024-02-02T22:08:06+00:00 | {"pretty_name": "Evaluation run of ChuckMcSneed/WinterGoddess-1.4x-70b-32k", "dataset_summary": "Dataset automatically created during the evaluation run of model [ChuckMcSneed/WinterGoddess-1.4x-70b-32k](https://huggingface.co/ChuckMcSneed/WinterGoddess-1.4x-70b-32k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ChuckMcSneed__WinterGoddess-1.4x-70b-32k\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T22:05:42.684950](https://huggingface.co/datasets/open-llm-leaderboard/details_ChuckMcSneed__WinterGoddess-1.4x-70b-32k/blob/main/results_2024-02-02T22-05-42.684950.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6639765808635061,\n \"acc_stderr\": 0.03149901214232908,\n \"acc_norm\": 0.6688177091509891,\n \"acc_norm_stderr\": 0.03212294826804706,\n \"mc1\": 0.4700122399020808,\n \"mc1_stderr\": 0.01747199209169754,\n \"mc2\": 0.6387436130479125,\n \"mc2_stderr\": 0.014303842525660086\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6732081911262798,\n \"acc_stderr\": 0.013706665975587338,\n \"acc_norm\": 0.71160409556314,\n \"acc_norm_stderr\": 0.013238394422428171\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7134037044413464,\n \"acc_stderr\": 0.004512471612415591,\n \"acc_norm\": 0.8911571400119498,\n \"acc_norm_stderr\": 0.0031080545633521087\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7828947368421053,\n \"acc_stderr\": 0.03355045304882924,\n \"acc_norm\": 0.7828947368421053,\n \"acc_norm_stderr\": 0.03355045304882924\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.03309615177059006,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.03309615177059006\n 
},\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.047551296160629475,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.047551296160629475\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6127659574468085,\n \"acc_stderr\": 0.03184389265339525,\n \"acc_norm\": 0.6127659574468085,\n \"acc_norm_stderr\": 0.03184389265339525\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.04343525428949098,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.04343525428949098\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7354838709677419,\n \"acc_stderr\": 0.02509189237885928,\n \"acc_norm\": 0.7354838709677419,\n \"acc_norm_stderr\": 0.02509189237885928\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5369458128078818,\n \"acc_stderr\": 0.035083705204426656,\n \"acc_norm\": 0.5369458128078818,\n \"acc_norm_stderr\": 0.035083705204426656\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.027998073798781668,\n \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.027998073798781668\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 
0.01989934131572178\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6461538461538462,\n \"acc_stderr\": 0.024243783994062164,\n \"acc_norm\": 0.6461538461538462,\n \"acc_norm_stderr\": 0.024243783994062164\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.02730914058823018,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02730914058823018\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887058,\n \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887058\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.45695364238410596,\n \"acc_stderr\": 0.04067325174247443,\n \"acc_norm\": 0.45695364238410596,\n \"acc_norm_stderr\": 0.04067325174247443\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8788990825688073,\n \"acc_stderr\": 0.013987618292389713,\n \"acc_norm\": 0.8788990825688073,\n \"acc_norm_stderr\": 0.013987618292389713\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8872549019607843,\n \"acc_stderr\": 0.02219857103945679,\n \"acc_norm\": 0.8872549019607843,\n \"acc_norm_stderr\": 0.02219857103945679\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8649789029535865,\n \"acc_stderr\": 0.022245776632003694,\n \"acc_norm\": 0.8649789029535865,\n \"acc_norm_stderr\": 0.022245776632003694\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7892376681614349,\n \"acc_stderr\": 0.027373095500540193,\n \"acc_norm\": 0.7892376681614349,\n \"acc_norm_stderr\": 0.027373095500540193\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159462,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159462\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8429752066115702,\n \"acc_stderr\": 0.03321244842547128,\n \"acc_norm\": 0.8429752066115702,\n \"acc_norm_stderr\": 0.03321244842547128\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742179,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742179\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5892857142857143,\n \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.5892857142857143,\n \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8974358974358975,\n \"acc_stderr\": 0.019875655027867443,\n \"acc_norm\": 0.8974358974358975,\n \"acc_norm_stderr\": 0.019875655027867443\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n \"acc_stderr\": 0.013428186370608303,\n \"acc_norm\": 0.8301404853128991,\n \"acc_norm_stderr\": 0.013428186370608303\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7601156069364162,\n \"acc_stderr\": 0.022989592543123567,\n \"acc_norm\": 0.7601156069364162,\n \"acc_norm_stderr\": 0.022989592543123567\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33631284916201115,\n \"acc_stderr\": 0.01580100372914589,\n \"acc_norm\": 0.33631284916201115,\n \"acc_norm_stderr\": 0.01580100372914589\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.026336613469046633,\n \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.026336613469046633\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7459807073954984,\n \"acc_stderr\": 0.024723861504771693,\n \"acc_norm\": 0.7459807073954984,\n \"acc_norm_stderr\": 0.024723861504771693\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7746913580246914,\n \"acc_stderr\": 0.02324620264781975,\n \"acc_norm\": 0.7746913580246914,\n \"acc_norm_stderr\": 0.02324620264781975\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5567375886524822,\n \"acc_stderr\": 0.02963483847376601,\n \"acc_norm\": 0.5567375886524822,\n \"acc_norm_stderr\": 0.02963483847376601\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5625814863102999,\n \"acc_stderr\": 0.01266981346493572,\n \"acc_norm\": 0.5625814863102999,\n \"acc_norm_stderr\": 0.01266981346493572\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.02850145286039656,\n \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.02850145286039656\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7401960784313726,\n \"acc_stderr\": 0.017740899509177795,\n \"acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.017740899509177795\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7673469387755102,\n \"acc_stderr\": 0.02704925791589618,\n \"acc_norm\": 0.7673469387755102,\n \"acc_norm_stderr\": 0.02704925791589618\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n \"acc_stderr\": 0.027962677604768914,\n \"acc_norm\": 0.8059701492537313,\n \"acc_norm_stderr\": 0.027962677604768914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036847,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036847\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4700122399020808,\n \"mc1_stderr\": 0.01747199209169754,\n \"mc2\": 0.6387436130479125,\n \"mc2_stderr\": 0.014303842525660086\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8255722178374112,\n \"acc_stderr\": 0.010665187902498438\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.43290371493555724,\n \"acc_stderr\": 0.013647916362576056\n }\n}\n```", "repo_url": "https://huggingface.co/ChuckMcSneed/WinterGoddess-1.4x-70b-32k", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|arc:challenge|25_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|gsm8k|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hellaswag|10_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T22-05-42.684950.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T22-05-42.684950.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T22-05-42.684950.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T22-05-42.684950.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T22-05-42.684950.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["**/details_harness|winogrande|5_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-02T22-05-42.684950.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_02T22_05_42.684950", "path": ["results_2024-02-02T22-05-42.684950.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T22-05-42.684950.parquet"]}]}]} | 2024-02-02T22:08:31+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of ChuckMcSneed/WinterGoddess-1.4x-70b-32k
Dataset automatically created during the evaluation run of model ChuckMcSneed/WinterGoddess-1.4x-70b-32k on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
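A minimal sketch, assuming the repo follows the leaderboard's usual `details_<org>__<model>` naming for this model:

```python
from datasets import load_dataset

# Repo id inferred from the leaderboard's naming convention (assumption)
data = load_dataset("open-llm-leaderboard/details_ChuckMcSneed__WinterGoddess-1.4x-70b-32k",
	"harness_winogrande_5",
	split="train")
```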
## Latest results
These are the latest results from run 2024-02-02T22:05:42.684950 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of ChuckMcSneed/WinterGoddess-1.4x-70b-32k\n\n\n\nDataset automatically created during the evaluation run of model ChuckMcSneed/WinterGoddess-1.4x-70b-32k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T22:05:42.684950(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of ChuckMcSneed/WinterGoddess-1.4x-70b-32k\n\n\n\nDataset automatically created during the evaluation run of model ChuckMcSneed/WinterGoddess-1.4x-70b-32k on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T22:05:42.684950(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
9ca0ae05eea63682559a479a0ab93c3c8bd91d63 |
# azaheadhealth
### Dataset INFO
```python
import datasets
from datasets.tasks import TextClassification

# Schema fragment made runnable as a DatasetInfo (imports added; structure unchanged)
info = datasets.DatasetInfo(
    features=datasets.Features(
        {
            "text": datasets.Value("string"),
            "label": datasets.ClassLabel(num_classes=2, names=["NEGATIVE", "POSITIVE"]),
        }
    ),
    supervised_keys=None,
    task_templates=[
        TextClassification(text_column="text", label_column="label")
    ],
)
```
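A minimal usage sketch (the repo id `clu-ling/azaheadhealth` and the `small` configuration name are taken from the description below and are assumptions about how the configs are exposed):

```python
from datasets import load_dataset

# Load the "small" configuration and inspect the schema defined above
ds = load_dataset("clu-ling/azaheadhealth", "small")
print(ds["train"].features["label"].names)  # ["NEGATIVE", "POSITIVE"]
print(ds["train"][0])                       # {"text": ..., "label": ...}
```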
### Dataset DESCRIPTION
`azaheadhealth` is a custom dataset for training binary text classifiers in the public health domain.
02.05.24 - The `small` dataset is available. This set contains a `train` and `test` split with 160 and 24 examples respectively, at roughly 10:6 Negative:Positive examples each. | clu-ling/azaheadhealth | [
"license:apache-2.0",
"region:us"
] | 2024-02-02T22:21:59+00:00 | {"license": "apache-2.0"} | 2024-02-07T01:20:37+00:00 | [] | [] | TAGS
#license-apache-2.0 #region-us
|
# azaheadhealth
### Dataset INFO
### Dataset DESCRIPTION
'azaheadhealth' is a custom dataset for training binary text classifiers in the public health domain.
02.05.24 - The 'small' dataset is available. This set contains a 'train' and 'test' split with 160 and 24 examples respectively, at roughly 10:6 Negative:Positive examples each. | [
"# azaheadhealth",
"### Dataset INFO",
"### Dataset DESCRIPTION\n\n'azaheadhealth' is a custom dataset for training binary text classifiers in the public health domain. \n\n02.05.24 - The 'small' dataset is available. This set contains a 'train' and 'test' split with 160 and 24 examples respectively, at roughly 10:6 Negative:Positive examples each."
] | [
"TAGS\n#license-apache-2.0 #region-us \n",
"# azaheadhealth",
"### Dataset INFO",
"### Dataset DESCRIPTION\n\n'azaheadhealth' is a custom dataset for training binary text classifiers in the public health domain. \n\n02.05.24 - The 'small' dataset is available. This set contains a 'train' and 'test' split with 160 and 24 examples respectively, at roughly 10:6 Negative:Positive examples each."
] |
bc6681f8cdccce5b634ac39e1b711ebd45d2a32e |
# Dataset Card for Evaluation run of pharaouk/fusedyi
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [pharaouk/fusedyi](https://huggingface.co/pharaouk/fusedyi) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
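# "train" always points to the latest run; timestamped splits select specific runs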
data = load_dataset("open-llm-leaderboard/details_pharaouk__fusedyi",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-02T22:29:21.622407](https://huggingface.co/datasets/open-llm-leaderboard/details_pharaouk__fusedyi/blob/main/results_2024-02-02T22-29-21.622407.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6230394167004014,
"acc_stderr": 0.032052237628091174,
"acc_norm": 0.6350950376915355,
"acc_norm_stderr": 0.032847591185885656,
"mc1": 0.29865361077111385,
"mc1_stderr": 0.016021570613768542,
"mc2": 0.49294065421956446,
"mc2_stderr": 0.01582830285008174
},
"harness|arc:challenge|25": {
"acc": 0.5341296928327645,
"acc_stderr": 0.014577311315231099,
"acc_norm": 0.5503412969283277,
"acc_norm_stderr": 0.014537144444284732
},
"harness|hellaswag|10": {
"acc": 0.5696076478789086,
"acc_stderr": 0.004941191607317912,
"acc_norm": 0.765982871937861,
"acc_norm_stderr": 0.0042251766237417325
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.0286372356398009,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.0286372356398009
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.03745554791462456,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.03745554791462456
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594962,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594962
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6482758620689655,
"acc_stderr": 0.039792366374974096,
"acc_norm": 0.6482758620689655,
"acc_norm_stderr": 0.039792366374974096
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4497354497354497,
"acc_stderr": 0.02562085704293665,
"acc_norm": 0.4497354497354497,
"acc_norm_stderr": 0.02562085704293665
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.035145285621750094,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.035145285621750094
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.03374402644139402,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.03374402644139402
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.027772533334218967,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.027772533334218967
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.02338193534812143,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.02338193534812143
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6487179487179487,
"acc_stderr": 0.024203665177902796,
"acc_norm": 0.6487179487179487,
"acc_norm_stderr": 0.024203665177902796
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7689075630252101,
"acc_stderr": 0.02738140692786891,
"acc_norm": 0.7689075630252101,
"acc_norm_stderr": 0.02738140692786891
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242741,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242741
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8238532110091743,
"acc_stderr": 0.01633288239343138,
"acc_norm": 0.8238532110091743,
"acc_norm_stderr": 0.01633288239343138
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.034086558679777494,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.034086558679777494
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.02910225438967408,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.02910225438967408
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233483,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233483
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313728,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313728
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709695,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709695
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.034926064766237906,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.034926064766237906
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9017094017094017,
"acc_stderr": 0.019503444900757567,
"acc_norm": 0.9017094017094017,
"acc_norm_stderr": 0.019503444900757567
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8007662835249042,
"acc_stderr": 0.014283378044296417,
"acc_norm": 0.8007662835249042,
"acc_norm_stderr": 0.014283378044296417
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500107,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500107
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2759776536312849,
"acc_stderr": 0.014950103002475363,
"acc_norm": 0.2759776536312849,
"acc_norm_stderr": 0.014950103002475363
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.02575586592263294,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.02575586592263294
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6820987654320988,
"acc_stderr": 0.02591006352824088,
"acc_norm": 0.6820987654320988,
"acc_norm_stderr": 0.02591006352824088
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303055,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303055
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46740547588005216,
"acc_stderr": 0.012743072942653347,
"acc_norm": 0.46740547588005216,
"acc_norm_stderr": 0.012743072942653347
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6433823529411765,
"acc_stderr": 0.029097209568411955,
"acc_norm": 0.6433823529411765,
"acc_norm_stderr": 0.029097209568411955
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.01904748523936038,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.01904748523936038
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.0282638899437846,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.0282638899437846
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8009950248756219,
"acc_stderr": 0.028231365092758406,
"acc_norm": 0.8009950248756219,
"acc_norm_stderr": 0.028231365092758406
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29865361077111385,
"mc1_stderr": 0.016021570613768542,
"mc2": 0.49294065421956446,
"mc2_stderr": 0.01582830285008174
},
"harness|winogrande|5": {
"acc": 0.7269139700078927,
"acc_stderr": 0.012522020105869456
},
"harness|gsm8k|5": {
"acc": 0.02047005307050796,
"acc_stderr": 0.0039004133859157153
}
}
```
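To read only the aggregated numbers, the repo metadata also exposes a "results" config with a "latest" split; a sketch (the exact row layout is an assumption):

```python
from datasets import load_dataset

# Config/split names come from the repo metadata; the row layout may differ
res = load_dataset("open-llm-leaderboard/details_pharaouk__fusedyi",
                   "results", split="latest")
print(res[0])  # aggregated metrics row for the latest run
```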
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_pharaouk__fusedyi | [
"region:us"
] | 2024-02-02T22:31:31+00:00 | {"pretty_name": "Evaluation run of pharaouk/fusedyi", "dataset_summary": "Dataset automatically created during the evaluation run of model [pharaouk/fusedyi](https://huggingface.co/pharaouk/fusedyi) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_pharaouk__fusedyi\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T22:29:21.622407](https://huggingface.co/datasets/open-llm-leaderboard/details_pharaouk__fusedyi/blob/main/results_2024-02-02T22-29-21.622407.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6230394167004014,\n \"acc_stderr\": 0.032052237628091174,\n \"acc_norm\": 0.6350950376915355,\n \"acc_norm_stderr\": 0.032847591185885656,\n \"mc1\": 0.29865361077111385,\n \"mc1_stderr\": 0.016021570613768542,\n \"mc2\": 0.49294065421956446,\n \"mc2_stderr\": 0.01582830285008174\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5341296928327645,\n \"acc_stderr\": 0.014577311315231099,\n \"acc_norm\": 0.5503412969283277,\n \"acc_norm_stderr\": 0.014537144444284732\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5696076478789086,\n \"acc_stderr\": 0.004941191607317912,\n \"acc_norm\": 0.765982871937861,\n \"acc_norm_stderr\": 0.0042251766237417325\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.0286372356398009,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.0286372356398009\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.03745554791462456,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.03745554791462456\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n 
\"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.03208115750788684,\n \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.03208115750788684\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n \"acc_stderr\": 0.04657047260594962,\n \"acc_norm\": 0.4298245614035088,\n \"acc_norm_stderr\": 0.04657047260594962\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6482758620689655,\n \"acc_stderr\": 0.039792366374974096,\n \"acc_norm\": 0.6482758620689655,\n \"acc_norm_stderr\": 0.039792366374974096\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4497354497354497,\n \"acc_stderr\": 0.02562085704293665,\n \"acc_norm\": 0.4497354497354497,\n \"acc_norm_stderr\": 0.02562085704293665\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.035145285621750094,\n \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.035145285621750094\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.03374402644139402,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.03374402644139402\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218967,\n \"acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218967\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.02338193534812143,\n \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.02338193534812143\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6487179487179487,\n \"acc_stderr\": 0.024203665177902796,\n \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902796\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7689075630252101,\n \"acc_stderr\": 0.02738140692786891,\n \"acc_norm\": 0.7689075630252101,\n \"acc_norm_stderr\": 0.02738140692786891\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242741,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242741\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8238532110091743,\n \"acc_stderr\": 0.01633288239343138,\n \"acc_norm\": 0.8238532110091743,\n \"acc_norm_stderr\": 0.01633288239343138\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.034086558679777494,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.034086558679777494\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967408,\n \"acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967408\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233483,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233483\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.6591928251121076,\n \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313728,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313728\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709695,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709695\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.034926064766237906,\n \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.034926064766237906\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8007662835249042,\n \"acc_stderr\": 0.014283378044296417,\n \"acc_norm\": 0.8007662835249042,\n 
\"acc_norm_stderr\": 0.014283378044296417\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500107,\n \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500107\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2759776536312849,\n \"acc_stderr\": 0.014950103002475363,\n \"acc_norm\": 0.2759776536312849,\n \"acc_norm_stderr\": 0.014950103002475363\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.02575586592263294,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.02575586592263294\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6820987654320988,\n \"acc_stderr\": 0.02591006352824088,\n \"acc_norm\": 0.6820987654320988,\n \"acc_norm_stderr\": 0.02591006352824088\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303055,\n \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303055\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46740547588005216,\n \"acc_stderr\": 0.012743072942653347,\n \"acc_norm\": 0.46740547588005216,\n \"acc_norm_stderr\": 0.012743072942653347\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6433823529411765,\n \"acc_stderr\": 0.029097209568411955,\n \"acc_norm\": 0.6433823529411765,\n \"acc_norm_stderr\": 0.029097209568411955\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6683006535947712,\n \"acc_stderr\": 0.01904748523936038,\n \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.01904748523936038\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.0282638899437846,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.0282638899437846\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8009950248756219,\n \"acc_stderr\": 0.028231365092758406,\n \"acc_norm\": 0.8009950248756219,\n \"acc_norm_stderr\": 0.028231365092758406\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.45180722891566266,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29865361077111385,\n \"mc1_stderr\": 0.016021570613768542,\n \"mc2\": 0.49294065421956446,\n \"mc2_stderr\": 0.01582830285008174\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7269139700078927,\n \"acc_stderr\": 0.012522020105869456\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.02047005307050796,\n \"acc_stderr\": 0.0039004133859157153\n }\n}\n```", "repo_url": "https://huggingface.co/pharaouk/fusedyi", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|arc:challenge|25_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|gsm8k|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hellaswag|10_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T22-29-21.622407.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T22-29-21.622407.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T22-29-21.622407.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T22-29-21.622407.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T22-29-21.622407.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T22-29-21.622407.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["**/details_harness|winogrande|5_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T22-29-21.622407.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_02T22_29_21.622407", "path": ["results_2024-02-02T22-29-21.622407.parquet"]}, {"split": "latest", "path": 
["results_2024-02-02T22-29-21.622407.parquet"]}]}]} | 2024-02-02T22:31:58+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of pharaouk/fusedyi
Dataset automatically created during the evaluation run of model pharaouk/fusedyi on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
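
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_pharaouk__fusedyi",
    "harness_winogrande_5",
    split="train")
```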
## Latest results
These are the latest results from run 2024-02-02T22:29:21.622407 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of pharaouk/fusedyi\n\n\n\nDataset automatically created during the evaluation run of model pharaouk/fusedyi on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T22:29:21.622407(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of pharaouk/fusedyi\n\n\n\nDataset automatically created during the evaluation run of model pharaouk/fusedyi on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T22:29:21.622407(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
85fe7b73103f3524ef3cd47299212723351ca37a | # zsql-postgres-dpo
This is a dataset for training machine learning models to convert natural
English language text into Postgres dialect SQL queries.
This dataset comprises 200,000 DPO pairs curated to support the rapid
development of text-to-SQL generation models. The uniqueness of this dataset
lies in its optimization process. The "chosen" field within each data pair
contains a SQL query that has been canonicalized and optimized, selected from
the candidate set as the query that minimizes syntactic cyclomatic and
asymptotic complexity against the given schema.
Direct Preference Optimization (see [Rafailov et al.,
2023](https://arxiv.org/abs/2305.18290)) is a novel approach to refinement
learning from positive and negative samples to modify the behavior of
large-scale unsupervised language models to align with human preferences. This
method simplifies the fine-tuning process, making it more stable and
computationally efficient without the need for extensive hyperparameter tuning
or LM sampling, and has been shown to effectively control model outputs,
matching or surpassing existing methods.
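
For reference, the DPO objective from Rafailov et al. (2023); in this dataset
the `chosen` and `rejected` queries play the roles of the preferred completion
$y_w$ and the dispreferred completion $y_l$, with the schema and question
forming the prompt $x$:

$$
\mathcal{L}_{\mathrm{DPO}}(\pi_\theta; \pi_{\mathrm{ref}}) =
-\,\mathbb{E}_{(x,\, y_w,\, y_l) \sim \mathcal{D}}
\left[ \log \sigma\!\left(
\beta \log \frac{\pi_\theta(y_w \mid x)}{\pi_{\mathrm{ref}}(y_w \mid x)}
- \beta \log \frac{\pi_\theta(y_l \mid x)}{\pi_{\mathrm{ref}}(y_l \mid x)}
\right) \right]
$$

where $\pi_\theta$ is the policy being fine-tuned, $\pi_{\mathrm{ref}}$ is a
frozen reference model, and $\beta$ controls how far the policy may drift from
the reference.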
The source data is cleaned and filtered based on the following criteria (a sketch of such a filter follows the list):
- Remove queries which are not in English.
- Remove queries which are not valid SQL queries.
- Remove queries which are not executable against the given schema.
- Remove queries which are executed against tables with non-Latin characters.
- Remove queries which use features not supported by the given database.
- Remove long queries which contain domain-specific knowledge which cause model confusion.
- Remove queries which do not fit within a 4096 token context window.
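
A minimal sketch of the kind of filter these criteria imply. The actual
pipeline is not published with the dataset, so the checks below are
illustrative approximations, and the executability check against a live
database is omitted:

```python
import sqlglot                 # SQL parser with a Postgres dialect
from langdetect import detect  # lightweight language identification

def keep(example, max_tokens=4096):
    # English-only questions.
    if detect(example["question"]) != "en":
        return False
    # The chosen query must parse as Postgres SQL.
    try:
        sqlglot.parse_one(example["chosen"], read="postgres")
    except sqlglot.errors.ParseError:
        return False
    # ASCII-only as a strict proxy for "no non-Latin characters".
    if not (example["schema"] + example["chosen"]).isascii():
        return False
    # Rough context-window check, using whitespace tokens as a proxy.
    text = example["schema"] + example["question"] + example["chosen"]
    return len(text.split()) <= max_tokens

filtered = dataset.filter(keep)
```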
## Usage
To load the dataset using the HuggingFace `datasets` library:
```python
from datasets import load_dataset
dataset = load_dataset("zerolink/zsql-postgres-dpo")
```
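
The dataset ships with `train` and `test` splits; a quick way to inspect a
record (field names are listed under Fields below):

```python
row = dataset["train"][0]
print(row["question"])  # natural language question
print(row["chosen"])    # preferred Postgres SQL query
```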
To use in model fine-tuning, apply the following chat tokenizer:
```python
from transformers import AutoTokenizer

model = "your-base-model"  # placeholder: any model that defines a chat template
tokenizer = AutoTokenizer.from_pretrained(model)

def tokenize(element):
    schema = element["schema"]
    question = element["question"]
    answer = element["chosen"]

    prompt = f"""
Using the schema:
{schema}
Generate SQL for the following question:
{question}
"""
    system = "Translate English to Postgres SQL."

    message = [
        {"role": "system", "content": system},
        {"role": "user", "content": prompt},
        {"role": "assistant", "content": answer},
    ]
    output = tokenizer.apply_chat_template(
        message, add_generation_prompt=False, tokenize=True
    )
    return {"text": output}
```
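
The function can then be mapped over the dataset, for instance:

```python
tokenized = dataset.map(tokenize)
```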
## Fields
The fields in this dataset are as follows:
| Field Name | Description |
| ---------- | ----------------------------------------------------------------------------------------------- |
| schema | The schema of the database. |
| question | The natural language question. |
| chosen | The DPO preferred SQL query. |
| rejected | The DPO rejected SQL query. |
| weight | The weight of the query in the reward function. |
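
For illustration, a record might look like the following (an invented example,
not drawn from the dataset):

```python
# Invented example for illustration; not an actual record from the dataset.
example = {
    "schema": "CREATE TABLE head (head_id INTEGER, age INTEGER)",
    "question": "How many department heads are older than 56?",
    "chosen": "SELECT COUNT(*) FROM head WHERE age > 56",
    "rejected": "SELECT COUNT(head_id) FROM head WHERE head.age > 56",
    "weight": 1.0,
}
```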
## Sources
This dataset exercises the following Postgres feature categories:
- [x] `datetime` - Use of Postgres date and time functions.
- [x] `json` - Use of Postgres JSON functions.
- [x] `math` - Use of Postgres math functions.
- [ ] `postgis` - Use of Postgres GIS functions.
- [x] `re` - Use of Postgres regular expression functions.
- [x] `rollup` - Use of Postgres rollup functions.
- [x] `set` - Use of Postgres set functions.
- [x] `string` - Use of Postgres string functions.
- [x] `vector` - Use of PGVector functions.
- [x] `window` - Use of Postgres window functions.

It is derived from the following sources:
| Source | License | External Link |
| ---------------------- | ------------ | -------------------------------------------------------------------------------------------------------------------- |
| wikisql | BSD 3-Clause | [https://github.com/salesforce/WikiSQL](https://github.com/salesforce/WikiSQL) |
| spider | CC-BY-SA-4.0 | [https://huggingface.co/datasets/spider](https://huggingface.co/datasets/spider) |
| sql_create_context | CC-BY-4.0 | [https://huggingface.co/datasets/b-mc2/sql-create-context](https://huggingface.co/datasets/b-mc2/sql-create-context) |
| squall | CC-BY-SA-4.0 | [https://github.com/tzshi/squall](https://github.com/tzshi/squall) |
| sede | Apache-2.0 | [https://github.com/hirupert/sede](https://github.com/hirupert/sede) |
| nvbench | MIT | [https://github.com/TsinghuaDatabaseGroup/nvBench](https://github.com/TsinghuaDatabaseGroup/nvBench) |
| imdb | Not Found | [https://github.com/jkkummerfeld/text2sql-data](https://github.com/jkkummerfeld/text2sql-data) |
| advising | CC-BY-4.0 | [https://github.com/jkkummerfeld/text2sql-data](https://github.com/jkkummerfeld/text2sql-data) |
| atis | Not Found | [https://github.com/jkkummerfeld/text2sql-data](https://github.com/jkkummerfeld/text2sql-data) |
| restaurants | Not Found | [https://github.com/jkkummerfeld/text2sql-data](https://github.com/jkkummerfeld/text2sql-data) |
| scholar | Not Found | [https://github.com/jkkummerfeld/text2sql-data](https://github.com/jkkummerfeld/text2sql-data) |
| yelp | Not Found | [https://github.com/jkkummerfeld/text2sql-data](https://github.com/jkkummerfeld/text2sql-data) |
| academic | Not Found | [https://github.com/jkkummerfeld/text2sql-data](https://github.com/jkkummerfeld/text2sql-data) |
| criteria2sql | Apache-2.0 | [https://github.com/xiaojingyu92/Criteria2SQL](https://github.com/xiaojingyu92/Criteria2SQL) |
| eICU | CC-BY-4.0 | [https://github.com/glee4810/EHRSQL](https://github.com/glee4810/EHRSQL) |
| mimic_iii | CC-BY-4.0 | [https://github.com/glee4810/EHRSQL](https://github.com/glee4810/EHRSQL) |
| mimicsql_data | MIT | [https://github.com/wangpinggl/TREQS](https://github.com/wangpinggl/TREQS) |
| worldsoccerdatabase | CC-BY-SA-4.0 | [https://github.com/chiahsuan156/KaggleDBQA](https://github.com/chiahsuan156/KaggleDBQA) |
| whatcdhiphop | CC-BY-SA-4.0 | [https://github.com/chiahsuan156/KaggleDBQA](https://github.com/chiahsuan156/KaggleDBQA) |
| studentmathscore | CC-BY-SA-4.0 | [https://github.com/chiahsuan156/KaggleDBQA](https://github.com/chiahsuan156/KaggleDBQA) |
| pesticide | CC-BY-SA-4.0 | [https://github.com/chiahsuan156/KaggleDBQA](https://github.com/chiahsuan156/KaggleDBQA) |
| thehistoryofbaseball | CC-BY-SA-4.0 | [https://github.com/chiahsuan156/KaggleDBQA](https://github.com/chiahsuan156/KaggleDBQA) |
| uswildfires | CC-BY-SA-4.0 | [https://github.com/chiahsuan156/KaggleDBQA](https://github.com/chiahsuan156/KaggleDBQA) |
| geonucleardata | CC-BY-SA-4.0 | [https://github.com/chiahsuan156/KaggleDBQA](https://github.com/chiahsuan156/KaggleDBQA) |
| greatermanchestercrime | CC-BY-SA-4.0 | [https://github.com/chiahsuan156/KaggleDBQA](https://github.com/chiahsuan156/KaggleDBQA) |
Composition:

## License
This dataset is provided for academic and research purposes. Please adhere to
the specified license terms and conditions for usage and distribution.
| zerolink/zsql-postgres-dpo | [
"task_categories:text2text-generation",
"task_categories:text-generation",
"language_creators:crowdsourced",
"language_creators:expert-generated",
"size_categories:100K<n<1M",
"language:en",
"license:other",
"dpo",
"text-to-sql",
"sql",
"arxiv:2305.18290",
"region:us"
] | 2024-02-02T22:53:51+00:00 | {"language_creators": ["crowdsourced", "expert-generated"], "language": ["en"], "license": "other", "size_categories": ["100K<n<1M"], "task_categories": ["text2text-generation", "text-generation"], "license_name": "other", "license_link": "https://github.com/zerolink-io/zsql-postgres-dpo", "dataset_info": {"features": [{"name": "schema", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "rejected", "dtype": "string"}, {"name": "chosen", "dtype": "string"}, {"name": "weight", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 246559437.43473467, "num_examples": 233393}, {"name": "test", "num_bytes": 27395962.565265343, "num_examples": 25933}], "download_size": 86570198, "dataset_size": 273955400.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "tags": ["dpo", "text-to-sql", "sql"]} | 2024-02-02T22:59:48+00:00 | [
"2305.18290"
] | [
"en"
] | TAGS
#task_categories-text2text-generation #task_categories-text-generation #language_creators-crowdsourced #language_creators-expert-generated #size_categories-100K<n<1M #language-English #license-other #dpo #text-to-sql #sql #arxiv-2305.18290 #region-us
| zsql-postgres-dpo
=================
This is a dataset for training machine learning models to convert natural
English language text into Postgres dialect SQL queries.
This dataset comprises 200,000 DPO pairs curated to support the rapid
development of text-to-SQL generation models. The uniqueness of this dataset
lies in its optimization process. The "chosen" field within each data pair
contains SQL queries that have been canonicalized, optimized, and which are
chosen from the candidate set which minimizes syntactic cyclomatic and
asymptotic complexity against the given schema.
Direct Preference Optimization (see Rafailov et al,
2023) is a novel approach to refinement
learning from positive and negative samples to modify the behavior of
large-scale unsupervised language models to align with human preferences This
method simplifies the fine-tuning process, making it more stable and
computationally efficient without the need for extensive hyperparameter tuning
or LM sampling, and has been shown to effectively control model outputs,
matching or surpassing existing methods.
The source data is cleaned and filtered based on the following criteria:
* Remove queries which are not in English.
* Remove queries which are not valid SQL queries.
* Remove queries which are not executable against the given schema.
* Remove queries which are executed against tables with non-Latin characters.
* Remove queries which use features not supported by the given database.
* Remove long queries which contain domain-specific knowledge which cause model confusion.
* Remove queries which do not fit within a 4096 token context window.
Usage
-----
To load the dataset using the HuggingFace 'datasets' library:
To use in model fine-tuning, apply the following chat tokenizer:
Fields
------
The fields in this dataset are as follows:
Sources
-------
This dataset is derived from the following sources:
* [x] 'datetime' - Use of Postgres date and time functions.
* [x] 'json' - Use of Postgres JSON functions.
* [x] 'math' - Use of Postgres math functions.
* [ ] 'postgis' - Use of Postgres GIS functions.
* [x] 're' - Use of Postgres regular expression functions.
* [x] 'rollup' - Use of Postgres rollup functions.
* [x] 'set' - Use of Postgres set functions.
* [x] 'string' - Use of Postgres string functions.
* [x] 'vector' - Use of PGVector functions.
* [x] 'window' - Use of Postgres window functions.
Source: wikisql, License: BSD 3-Clause, External Link: URL
Source: spider, License: CC-BY-SA-4.0, External Link: URL
Source: sql\_create\_context, License: CC-BY-4.0, External Link: URL
Source: squall, License: CC-BY-SA-4.0, External Link: URL
Source: sede, License: Apache-2.0, External Link: URL
Source: nvbench, License: MIT, External Link: URL
Source: imdb, License: Not Found, External Link: URL
Source: advising, License: CC-BY-4.0, External Link: URL
Source: atis, License: Not Found, External Link: URL
Source: restaurants, License: Not Found, External Link: URL
Source: scholar, License: Not Found, External Link: URL
Source: yelp, License: Not Found, External Link: URL
Source: academic, License: Not Found, External Link: URL
Source: criteria2sql, License: Apache-2.0, External Link: URL
Source: eICU, License: CC-BY-4.0, External Link: URL
Source: mimic\_iii, License: CC-BY-4.0, External Link: URL
Source: mimicsql\_data, License: MIT, External Link: URL
Source: worldsoccerdatabase, License: CC-BY-SA-4.0, External Link: URL
Source: whatcdhiphop, License: CC-BY-SA-4.0, External Link: URL
Source: studentmathscore, License: CC-BY-SA-4.0, External Link: URL
Source: pesticide, License: CC-BY-SA-4.0, External Link: URL
Source: thehistoryofbaseball, License: CC-BY-SA-4.0, External Link: URL
Source: uswildfires, License: CC-BY-SA-4.0, External Link: URL
Source: geonucleardata, License: CC-BY-SA-4.0, External Link: URL
Source: greatermanchestercrime, License: CC-BY-SA-4.0, External Link: URL
Composition:
!Composition
License
-------
This dataset is provided for academic and research purposes. Please adhere to
the specified license terms and conditions for usage and distribution.
| [] | [
"TAGS\n#task_categories-text2text-generation #task_categories-text-generation #language_creators-crowdsourced #language_creators-expert-generated #size_categories-100K<n<1M #language-English #license-other #dpo #text-to-sql #sql #arxiv-2305.18290 #region-us \n"
] |
2c4bb4084e9a4f7d7c693a4bad23394a736fd26c |
# Dataset Card for Evaluation run of alchemonaut/BoreanGale-70B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [alchemonaut/BoreanGale-70B](https://huggingface.co/alchemonaut/BoreanGale-70B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_alchemonaut__BoreanGale-70B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-02T23:15:05.818053](https://huggingface.co/datasets/open-llm-leaderboard/details_alchemonaut__BoreanGale-70B/blob/main/results_2024-02-02T23-15-05.818053.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7504730019239859,
"acc_stderr": 0.028717616307233827,
"acc_norm": 0.7540443972841604,
"acc_norm_stderr": 0.029263680905302243,
"mc1": 0.5263157894736842,
"mc1_stderr": 0.017479241161975453,
"mc2": 0.6859618221240749,
"mc2_stderr": 0.014566147300959674
},
"harness|arc:challenge|25": {
"acc": 0.6868600682593856,
"acc_stderr": 0.013552671543623504,
"acc_norm": 0.7389078498293515,
"acc_norm_stderr": 0.012835523909473848
},
"harness|hellaswag|10": {
"acc": 0.717486556462856,
"acc_stderr": 0.004493015945599716,
"acc_norm": 0.8937462656841266,
"acc_norm_stderr": 0.003075323010408428
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6888888888888889,
"acc_stderr": 0.03999262876617722,
"acc_norm": 0.6888888888888889,
"acc_norm_stderr": 0.03999262876617722
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8355263157894737,
"acc_stderr": 0.03016753346863271,
"acc_norm": 0.8355263157894737,
"acc_norm_stderr": 0.03016753346863271
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7849056603773585,
"acc_stderr": 0.025288394502891366,
"acc_norm": 0.7849056603773585,
"acc_norm_stderr": 0.025288394502891366
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.875,
"acc_stderr": 0.02765610492929436,
"acc_norm": 0.875,
"acc_norm_stderr": 0.02765610492929436
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.03391750322321659,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.03391750322321659
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.04966570903978529,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.04966570903978529
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7361702127659574,
"acc_stderr": 0.028809989854102956,
"acc_norm": 0.7361702127659574,
"acc_norm_stderr": 0.028809989854102956
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7241379310344828,
"acc_stderr": 0.03724563619774632,
"acc_norm": 0.7241379310344828,
"acc_norm_stderr": 0.03724563619774632
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5238095238095238,
"acc_stderr": 0.02572209706438851,
"acc_norm": 0.5238095238095238,
"acc_norm_stderr": 0.02572209706438851
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5317460317460317,
"acc_stderr": 0.04463112720677173,
"acc_norm": 0.5317460317460317,
"acc_norm_stderr": 0.04463112720677173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8838709677419355,
"acc_stderr": 0.018225757949432302,
"acc_norm": 0.8838709677419355,
"acc_norm_stderr": 0.018225757949432302
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6650246305418719,
"acc_stderr": 0.033208527423483104,
"acc_norm": 0.6650246305418719,
"acc_norm_stderr": 0.033208527423483104
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8606060606060606,
"acc_stderr": 0.027045948825865397,
"acc_norm": 0.8606060606060606,
"acc_norm_stderr": 0.027045948825865397
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8939393939393939,
"acc_stderr": 0.021938047738853102,
"acc_norm": 0.8939393939393939,
"acc_norm_stderr": 0.021938047738853102
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9378238341968912,
"acc_stderr": 0.01742697415424053,
"acc_norm": 0.9378238341968912,
"acc_norm_stderr": 0.01742697415424053
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7794871794871795,
"acc_stderr": 0.02102067268082791,
"acc_norm": 0.7794871794871795,
"acc_norm_stderr": 0.02102067268082791
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.40370370370370373,
"acc_stderr": 0.029914812342227627,
"acc_norm": 0.40370370370370373,
"acc_norm_stderr": 0.029914812342227627
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.865546218487395,
"acc_stderr": 0.022159373072744442,
"acc_norm": 0.865546218487395,
"acc_norm_stderr": 0.022159373072744442
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.48344370860927155,
"acc_stderr": 0.0408024418562897,
"acc_norm": 0.48344370860927155,
"acc_norm_stderr": 0.0408024418562897
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9174311926605505,
"acc_stderr": 0.01180036136301657,
"acc_norm": 0.9174311926605505,
"acc_norm_stderr": 0.01180036136301657
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.030546745264953178,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.030546745264953178
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9166666666666666,
"acc_stderr": 0.019398452135813905,
"acc_norm": 0.9166666666666666,
"acc_norm_stderr": 0.019398452135813905
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.919831223628692,
"acc_stderr": 0.01767667999189164,
"acc_norm": 0.919831223628692,
"acc_norm_stderr": 0.01767667999189164
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8071748878923767,
"acc_stderr": 0.026478240960489365,
"acc_norm": 0.8071748878923767,
"acc_norm_stderr": 0.026478240960489365
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8702290076335878,
"acc_stderr": 0.029473649496907065,
"acc_norm": 0.8702290076335878,
"acc_norm_stderr": 0.029473649496907065
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9090909090909091,
"acc_stderr": 0.026243194054073903,
"acc_norm": 0.9090909090909091,
"acc_norm_stderr": 0.026243194054073903
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.03434300243630999,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.03434300243630999
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.030446777687971726,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.030446777687971726
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6875,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.8737864077669902,
"acc_stderr": 0.0328818027880863,
"acc_norm": 0.8737864077669902,
"acc_norm_stderr": 0.0328818027880863
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9230769230769231,
"acc_stderr": 0.017456987872436193,
"acc_norm": 0.9230769230769231,
"acc_norm_stderr": 0.017456987872436193
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.78,
"acc_stderr": 0.041633319989322626,
"acc_norm": 0.78,
"acc_norm_stderr": 0.041633319989322626
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8863346104725415,
"acc_stderr": 0.011350359050566023,
"acc_norm": 0.8863346104725415,
"acc_norm_stderr": 0.011350359050566023
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8236994219653179,
"acc_stderr": 0.020516425672490717,
"acc_norm": 0.8236994219653179,
"acc_norm_stderr": 0.020516425672490717
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.623463687150838,
"acc_stderr": 0.016204672385106606,
"acc_norm": 0.623463687150838,
"acc_norm_stderr": 0.016204672385106606
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8202614379084967,
"acc_stderr": 0.021986032182064148,
"acc_norm": 0.8202614379084967,
"acc_norm_stderr": 0.021986032182064148
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.842443729903537,
"acc_stderr": 0.020692237273583984,
"acc_norm": 0.842443729903537,
"acc_norm_stderr": 0.020692237273583984
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8580246913580247,
"acc_stderr": 0.019420260109438287,
"acc_norm": 0.8580246913580247,
"acc_norm_stderr": 0.019420260109438287
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5921985815602837,
"acc_stderr": 0.02931601177634356,
"acc_norm": 0.5921985815602837,
"acc_norm_stderr": 0.02931601177634356
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5990873533246415,
"acc_stderr": 0.012516960350640816,
"acc_norm": 0.5990873533246415,
"acc_norm_stderr": 0.012516960350640816
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.02388688192244033,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.02388688192244033
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.015422512066262554,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.015422512066262554
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8122448979591836,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.8122448979591836,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9154228855721394,
"acc_stderr": 0.019675343217199177,
"acc_norm": 0.9154228855721394,
"acc_norm_stderr": 0.019675343217199177
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.95,
"acc_stderr": 0.02190429135575904,
"acc_norm": 0.95,
"acc_norm_stderr": 0.02190429135575904
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5783132530120482,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.5783132530120482,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8830409356725146,
"acc_stderr": 0.024648068961366152,
"acc_norm": 0.8830409356725146,
"acc_norm_stderr": 0.024648068961366152
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5263157894736842,
"mc1_stderr": 0.017479241161975453,
"mc2": 0.6859618221240749,
"mc2_stderr": 0.014566147300959674
},
"harness|winogrande|5": {
"acc": 0.8453038674033149,
"acc_stderr": 0.010163172650433544
},
"harness|gsm8k|5": {
"acc": 0.6732373009855952,
"acc_stderr": 0.012919408108656424
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_alchemonaut__BoreanGale-70B | [
"region:us"
] | 2024-02-02T23:17:27+00:00 | {"pretty_name": "Evaluation run of alchemonaut/BoreanGale-70B", "dataset_summary": "Dataset automatically created during the evaluation run of model [alchemonaut/BoreanGale-70B](https://huggingface.co/alchemonaut/BoreanGale-70B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_alchemonaut__BoreanGale-70B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T23:15:05.818053](https://huggingface.co/datasets/open-llm-leaderboard/details_alchemonaut__BoreanGale-70B/blob/main/results_2024-02-02T23-15-05.818053.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7504730019239859,\n \"acc_stderr\": 0.028717616307233827,\n \"acc_norm\": 0.7540443972841604,\n \"acc_norm_stderr\": 0.029263680905302243,\n \"mc1\": 0.5263157894736842,\n \"mc1_stderr\": 0.017479241161975453,\n \"mc2\": 0.6859618221240749,\n \"mc2_stderr\": 0.014566147300959674\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6868600682593856,\n \"acc_stderr\": 0.013552671543623504,\n \"acc_norm\": 0.7389078498293515,\n \"acc_norm_stderr\": 0.012835523909473848\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.717486556462856,\n \"acc_stderr\": 0.004493015945599716,\n \"acc_norm\": 0.8937462656841266,\n \"acc_norm_stderr\": 0.003075323010408428\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6888888888888889,\n \"acc_stderr\": 0.03999262876617722,\n \"acc_norm\": 0.6888888888888889,\n \"acc_norm_stderr\": 0.03999262876617722\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8355263157894737,\n \"acc_stderr\": 0.03016753346863271,\n \"acc_norm\": 0.8355263157894737,\n \"acc_norm_stderr\": 0.03016753346863271\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7849056603773585,\n \"acc_stderr\": 0.025288394502891366,\n \"acc_norm\": 0.7849056603773585,\n \"acc_norm_stderr\": 0.025288394502891366\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.875,\n \"acc_stderr\": 0.02765610492929436,\n \"acc_norm\": 0.875,\n \"acc_norm_stderr\": 0.02765610492929436\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 
0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.03391750322321659,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.03391750322321659\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.04966570903978529,\n \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.04966570903978529\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7361702127659574,\n \"acc_stderr\": 0.028809989854102956,\n \"acc_norm\": 0.7361702127659574,\n \"acc_norm_stderr\": 0.028809989854102956\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7241379310344828,\n \"acc_stderr\": 0.03724563619774632,\n \"acc_norm\": 0.7241379310344828,\n \"acc_norm_stderr\": 0.03724563619774632\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.5238095238095238,\n \"acc_stderr\": 0.02572209706438851,\n \"acc_norm\": 0.5238095238095238,\n \"acc_norm_stderr\": 0.02572209706438851\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5317460317460317,\n \"acc_stderr\": 0.04463112720677173,\n \"acc_norm\": 0.5317460317460317,\n \"acc_norm_stderr\": 0.04463112720677173\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8838709677419355,\n \"acc_stderr\": 0.018225757949432302,\n \"acc_norm\": 0.8838709677419355,\n \"acc_norm_stderr\": 0.018225757949432302\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6650246305418719,\n \"acc_stderr\": 0.033208527423483104,\n \"acc_norm\": 0.6650246305418719,\n \"acc_norm_stderr\": 0.033208527423483104\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8606060606060606,\n \"acc_stderr\": 0.027045948825865397,\n \"acc_norm\": 0.8606060606060606,\n \"acc_norm_stderr\": 0.027045948825865397\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8939393939393939,\n \"acc_stderr\": 0.021938047738853102,\n \"acc_norm\": 0.8939393939393939,\n \"acc_norm_stderr\": 0.021938047738853102\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9378238341968912,\n \"acc_stderr\": 0.01742697415424053,\n \"acc_norm\": 0.9378238341968912,\n \"acc_norm_stderr\": 0.01742697415424053\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7794871794871795,\n \"acc_stderr\": 0.02102067268082791,\n \"acc_norm\": 0.7794871794871795,\n \"acc_norm_stderr\": 0.02102067268082791\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.40370370370370373,\n \"acc_stderr\": 0.029914812342227627,\n \"acc_norm\": 0.40370370370370373,\n \"acc_norm_stderr\": 0.029914812342227627\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.865546218487395,\n \"acc_stderr\": 0.022159373072744442,\n \"acc_norm\": 0.865546218487395,\n \"acc_norm_stderr\": 0.022159373072744442\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.48344370860927155,\n \"acc_stderr\": 0.0408024418562897,\n \"acc_norm\": 0.48344370860927155,\n \"acc_norm_stderr\": 0.0408024418562897\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9174311926605505,\n \"acc_stderr\": 0.01180036136301657,\n \"acc_norm\": 0.9174311926605505,\n \"acc_norm_stderr\": 0.01180036136301657\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.030546745264953178,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.030546745264953178\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9166666666666666,\n \"acc_stderr\": 0.019398452135813905,\n \"acc_norm\": 0.9166666666666666,\n \"acc_norm_stderr\": 0.019398452135813905\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.919831223628692,\n \"acc_stderr\": 0.01767667999189164,\n \"acc_norm\": 0.919831223628692,\n \"acc_norm_stderr\": 0.01767667999189164\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8071748878923767,\n \"acc_stderr\": 0.026478240960489365,\n \"acc_norm\": 0.8071748878923767,\n \"acc_norm_stderr\": 0.026478240960489365\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9090909090909091,\n \"acc_stderr\": 0.026243194054073903,\n \"acc_norm\": 0.9090909090909091,\n \"acc_norm_stderr\": 0.026243194054073903\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8518518518518519,\n \"acc_stderr\": 0.03434300243630999,\n \"acc_norm\": 0.8518518518518519,\n \"acc_norm_stderr\": 0.03434300243630999\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.030446777687971726,\n \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.030446777687971726\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8737864077669902,\n \"acc_stderr\": 0.0328818027880863,\n \"acc_norm\": 0.8737864077669902,\n \"acc_norm_stderr\": 0.0328818027880863\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9230769230769231,\n \"acc_stderr\": 0.017456987872436193,\n \"acc_norm\": 0.9230769230769231,\n \"acc_norm_stderr\": 0.017456987872436193\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.041633319989322626,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.041633319989322626\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8863346104725415,\n 
\"acc_stderr\": 0.011350359050566023,\n \"acc_norm\": 0.8863346104725415,\n \"acc_norm_stderr\": 0.011350359050566023\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8236994219653179,\n \"acc_stderr\": 0.020516425672490717,\n \"acc_norm\": 0.8236994219653179,\n \"acc_norm_stderr\": 0.020516425672490717\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.623463687150838,\n \"acc_stderr\": 0.016204672385106606,\n \"acc_norm\": 0.623463687150838,\n \"acc_norm_stderr\": 0.016204672385106606\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8202614379084967,\n \"acc_stderr\": 0.021986032182064148,\n \"acc_norm\": 0.8202614379084967,\n \"acc_norm_stderr\": 0.021986032182064148\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.842443729903537,\n \"acc_stderr\": 0.020692237273583984,\n \"acc_norm\": 0.842443729903537,\n \"acc_norm_stderr\": 0.020692237273583984\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8580246913580247,\n \"acc_stderr\": 0.019420260109438287,\n \"acc_norm\": 0.8580246913580247,\n \"acc_norm_stderr\": 0.019420260109438287\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5921985815602837,\n \"acc_stderr\": 0.02931601177634356,\n \"acc_norm\": 0.5921985815602837,\n \"acc_norm_stderr\": 0.02931601177634356\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5990873533246415,\n \"acc_stderr\": 0.012516960350640816,\n \"acc_norm\": 0.5990873533246415,\n \"acc_norm_stderr\": 0.012516960350640816\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8088235294117647,\n \"acc_stderr\": 0.02388688192244033,\n \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.02388688192244033\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.015422512066262554,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.015422512066262554\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8122448979591836,\n \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.8122448979591836,\n \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9154228855721394,\n \"acc_stderr\": 0.019675343217199177,\n \"acc_norm\": 0.9154228855721394,\n \"acc_norm_stderr\": 0.019675343217199177\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.95,\n \"acc_stderr\": 0.02190429135575904,\n \"acc_norm\": 0.95,\n \"acc_norm_stderr\": 0.02190429135575904\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.5783132530120482,\n \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.024648068961366152,\n \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.024648068961366152\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5263157894736842,\n \"mc1_stderr\": 0.017479241161975453,\n \"mc2\": 0.6859618221240749,\n \"mc2_stderr\": 0.014566147300959674\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8453038674033149,\n \"acc_stderr\": 0.010163172650433544\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6732373009855952,\n \"acc_stderr\": 0.012919408108656424\n }\n}\n```", 
"repo_url": "https://huggingface.co/alchemonaut/BoreanGale-70B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|arc:challenge|25_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|gsm8k|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hellaswag|10_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T23-15-05.818053.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T23-15-05.818053.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T23-15-05.818053.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T23-15-05.818053.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T23-15-05.818053.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T23_15_05.818053", "path": ["**/details_harness|winogrande|5_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T23-15-05.818053.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_02T23_15_05.818053", "path": ["results_2024-02-02T23-15-05.818053.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T23-15-05.818053.parquet"]}]}]} | 2024-02-02T23:17:52+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of alchemonaut/BoreanGale-70B
Dataset automatically created during the evaluation run of model alchemonaut/BoreanGale-70B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
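A minimal sketch, assuming the `datasets` library and the `details_<org>__<model>` repo naming convention used by the other leaderboard cards in this dump (the exact repo id is an inference from that pattern):

```python
from datasets import load_dataset

# Repo id follows the details_<org>__<model> naming convention;
# "harness_winogrande_5" is one of the per-task configurations.
data = load_dataset("open-llm-leaderboard/details_alchemonaut__BoreanGale-70B",
                    "harness_winogrande_5",
                    split="train")
```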
## Latest results
These are the latest results from run 2024-02-02T23:15:05.818053 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of alchemonaut/BoreanGale-70B\n\n\n\nDataset automatically created during the evaluation run of model alchemonaut/BoreanGale-70B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T23:15:05.818053(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of alchemonaut/BoreanGale-70B\n\n\n\nDataset automatically created during the evaluation run of model alchemonaut/BoreanGale-70B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T23:15:05.818053(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
9502773f8b3f7d8828bd969611898123f43e367b | created a total of 50 images
jlbaker361/ddpo-stability-CONDITIONAL std: 0.3714136481285095 mean: 3.841236877441406
jlbaker361/ddpo-stability-dcgan-CONDITIONAL std: 0.40740445256233215 mean: 3.882120623588562 | jlbaker361/stability-ddpo-evaluation-0-cond | [
"region:us"
] | 2024-02-02T23:38:21+00:00 | {} | 2024-02-02T23:38:26+00:00 | [] | [] | TAGS
#region-us
| created a total of 50 images
jlbaker361/ddpo-stability-CONDITIONAL std: 0.3714136481285095 mean: 3.841236877441406
jlbaker361/ddpo-stability-dcgan-CONDITIONAL std: 0.40740445256233215 mean: 3.882120623588562 | [] | [
"TAGS\n#region-us \n"
] |
bc466e85fc14f75e119a0443928ac7138a28f6f4 |
# Dataset Card for Evaluation run of Josephgflowers/TinyLlama-3T-Cinder-v1.3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Josephgflowers/TinyLlama-3T-Cinder-v1.3](https://huggingface.co/Josephgflowers/TinyLlama-3T-Cinder-v1.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Josephgflowers__TinyLlama-3T-Cinder-v1.3",
"harness_winogrande_5",
split="train")
```
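Each evaluated task is exposed as its own configuration (for example `harness_arc_challenge_25` or `harness_gsm8k_5`), and the aggregated metrics live in the `results` configuration. A minimal sketch of pulling the aggregated scores, assuming the split names declared in this card's metadata, where `latest` mirrors the most recent timestamped run:

```python
from datasets import load_dataset

# "results" holds the aggregated metrics for the run; the "latest"
# split always points at the most recent timestamped evaluation.
results = load_dataset("open-llm-leaderboard/details_Josephgflowers__TinyLlama-3T-Cinder-v1.3",
                       "results",
                       split="latest")
```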
## Latest results
These are the [latest results from run 2024-02-02T23:56:02.747267](https://huggingface.co/datasets/open-llm-leaderboard/details_Josephgflowers__TinyLlama-3T-Cinder-v1.3/blob/main/results_2024-02-02T23-56-02.747267.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2606884815058437,
"acc_stderr": 0.030908143471996267,
"acc_norm": 0.26112195815444134,
"acc_norm_stderr": 0.03164400632380069,
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931586,
"mc2": 0.38128711115047254,
"mc2_stderr": 0.013974832670540031
},
"harness|arc:challenge|25": {
"acc": 0.3054607508532423,
"acc_stderr": 0.013460080478002508,
"acc_norm": 0.3395904436860068,
"acc_norm_stderr": 0.01383903976282016
},
"harness|hellaswag|10": {
"acc": 0.4340768771161123,
"acc_stderr": 0.0049462215121452765,
"acc_norm": 0.5813582951603267,
"acc_norm_stderr": 0.004923281841828511
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.03853254836552003,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.03853254836552003
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.03317672787533157,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.03317672787533157
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2490566037735849,
"acc_stderr": 0.0266164829805017,
"acc_norm": 0.2490566037735849,
"acc_norm_stderr": 0.0266164829805017
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2708333333333333,
"acc_stderr": 0.03716177437566016,
"acc_norm": 0.2708333333333333,
"acc_norm_stderr": 0.03716177437566016
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653697,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653697
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.031568093627031744,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.031568093627031744
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.20425531914893616,
"acc_stderr": 0.026355158413349424,
"acc_norm": 0.20425531914893616,
"acc_norm_stderr": 0.026355158413349424
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281336,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281336
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2620689655172414,
"acc_stderr": 0.036646663372252565,
"acc_norm": 0.2620689655172414,
"acc_norm_stderr": 0.036646663372252565
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2275132275132275,
"acc_stderr": 0.021591269407823785,
"acc_norm": 0.2275132275132275,
"acc_norm_stderr": 0.021591269407823785
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604673,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604673
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.15,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.15,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.22580645161290322,
"acc_stderr": 0.023785577884181012,
"acc_norm": 0.22580645161290322,
"acc_norm_stderr": 0.023785577884181012
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.23645320197044334,
"acc_stderr": 0.02989611429173354,
"acc_norm": 0.23645320197044334,
"acc_norm_stderr": 0.02989611429173354
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3090909090909091,
"acc_stderr": 0.03608541011573967,
"acc_norm": 0.3090909090909091,
"acc_norm_stderr": 0.03608541011573967
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.1717171717171717,
"acc_stderr": 0.026869716187429917,
"acc_norm": 0.1717171717171717,
"acc_norm_stderr": 0.026869716187429917
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.17616580310880828,
"acc_stderr": 0.027493504244548047,
"acc_norm": 0.17616580310880828,
"acc_norm_stderr": 0.027493504244548047
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.25384615384615383,
"acc_stderr": 0.022066054378726257,
"acc_norm": 0.25384615384615383,
"acc_norm_stderr": 0.022066054378726257
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.02646611753895991,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.02646611753895991
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.02755361446786381,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.02755361446786381
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2185430463576159,
"acc_stderr": 0.033742355504256936,
"acc_norm": 0.2185430463576159,
"acc_norm_stderr": 0.033742355504256936
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1889908256880734,
"acc_stderr": 0.01678548115920363,
"acc_norm": 0.1889908256880734,
"acc_norm_stderr": 0.01678548115920363
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.33796296296296297,
"acc_stderr": 0.03225941352631295,
"acc_norm": 0.33796296296296297,
"acc_norm_stderr": 0.03225941352631295
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.03132179803083291,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.03132179803083291
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2869198312236287,
"acc_stderr": 0.02944377302259469,
"acc_norm": 0.2869198312236287,
"acc_norm_stderr": 0.02944377302259469
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.27802690582959644,
"acc_stderr": 0.03006958487449405,
"acc_norm": 0.27802690582959644,
"acc_norm_stderr": 0.03006958487449405
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.03768335959728742,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.03768335959728742
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.31901840490797545,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.31901840490797545,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.042466243366976256,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.042466243366976256
},
"harness|hendrycksTest-management|5": {
"acc": 0.1941747572815534,
"acc_stderr": 0.039166677628225864,
"acc_norm": 0.1941747572815534,
"acc_norm_stderr": 0.039166677628225864
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.28205128205128205,
"acc_stderr": 0.02948036054954119,
"acc_norm": 0.28205128205128205,
"acc_norm_stderr": 0.02948036054954119
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.25287356321839083,
"acc_stderr": 0.01554337731371968,
"acc_norm": 0.25287356321839083,
"acc_norm_stderr": 0.01554337731371968
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2514450867052023,
"acc_stderr": 0.02335736578587404,
"acc_norm": 0.2514450867052023,
"acc_norm_stderr": 0.02335736578587404
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25139664804469275,
"acc_stderr": 0.014508979453553969,
"acc_norm": 0.25139664804469275,
"acc_norm_stderr": 0.014508979453553969
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.02355083135199509,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.02355083135199509
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.29260450160771706,
"acc_stderr": 0.025839898334877983,
"acc_norm": 0.29260450160771706,
"acc_norm_stderr": 0.025839898334877983
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2623456790123457,
"acc_stderr": 0.024477222856135104,
"acc_norm": 0.2623456790123457,
"acc_norm_stderr": 0.024477222856135104
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.20921985815602837,
"acc_stderr": 0.024264769439988468,
"acc_norm": 0.20921985815602837,
"acc_norm_stderr": 0.024264769439988468
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24967405475880053,
"acc_stderr": 0.011054538377832318,
"acc_norm": 0.24967405475880053,
"acc_norm_stderr": 0.011054538377832318
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3492647058823529,
"acc_stderr": 0.02895975519682486,
"acc_norm": 0.3492647058823529,
"acc_norm_stderr": 0.02895975519682486
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.01755581809132227,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.01755581809132227
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072773,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072773
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17551020408163265,
"acc_stderr": 0.02435280072297001,
"acc_norm": 0.17551020408163265,
"acc_norm_stderr": 0.02435280072297001
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.029929415408348387,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.029929415408348387
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2891566265060241,
"acc_stderr": 0.03529486801511115,
"acc_norm": 0.2891566265060241,
"acc_norm_stderr": 0.03529486801511115
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.03508771929824563,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.03508771929824563
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931586,
"mc2": 0.38128711115047254,
"mc2_stderr": 0.013974832670540031
},
"harness|winogrande|5": {
"acc": 0.6393054459352802,
"acc_stderr": 0.013496064394234028
},
"harness|gsm8k|5": {
"acc": 0.03790750568612585,
"acc_stderr": 0.005260333907798431
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Josephgflowers__TinyLlama-3T-Cinder-v1.3 | [
"region:us"
] | 2024-02-02T23:57:48+00:00 | {"pretty_name": "Evaluation run of Josephgflowers/TinyLlama-3T-Cinder-v1.3", "dataset_summary": "Dataset automatically created during the evaluation run of model [Josephgflowers/TinyLlama-3T-Cinder-v1.3](https://huggingface.co/Josephgflowers/TinyLlama-3T-Cinder-v1.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Josephgflowers__TinyLlama-3T-Cinder-v1.3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T23:56:02.747267](https://huggingface.co/datasets/open-llm-leaderboard/details_Josephgflowers__TinyLlama-3T-Cinder-v1.3/blob/main/results_2024-02-02T23-56-02.747267.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2606884815058437,\n \"acc_stderr\": 0.030908143471996267,\n \"acc_norm\": 0.26112195815444134,\n \"acc_norm_stderr\": 0.03164400632380069,\n \"mc1\": 0.23378212974296206,\n \"mc1_stderr\": 0.014816195991931586,\n \"mc2\": 0.38128711115047254,\n \"mc2_stderr\": 0.013974832670540031\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3054607508532423,\n \"acc_stderr\": 0.013460080478002508,\n \"acc_norm\": 0.3395904436860068,\n \"acc_norm_stderr\": 0.01383903976282016\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4340768771161123,\n \"acc_stderr\": 0.0049462215121452765,\n \"acc_norm\": 0.5813582951603267,\n \"acc_norm_stderr\": 0.004923281841828511\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2740740740740741,\n \"acc_stderr\": 0.03853254836552003,\n \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.03853254836552003\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.03317672787533157,\n \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.03317672787533157\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2490566037735849,\n \"acc_stderr\": 0.0266164829805017,\n \"acc_norm\": 0.2490566037735849,\n \"acc_norm_stderr\": 0.0266164829805017\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2708333333333333,\n \"acc_stderr\": 0.03716177437566016,\n \"acc_norm\": 0.2708333333333333,\n \"acc_norm_stderr\": 
0.03716177437566016\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653697,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653697\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.21965317919075145,\n \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.20425531914893616,\n \"acc_stderr\": 0.026355158413349424,\n \"acc_norm\": 0.20425531914893616,\n \"acc_norm_stderr\": 0.026355158413349424\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.03999423879281336,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.03999423879281336\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2620689655172414,\n \"acc_stderr\": 0.036646663372252565,\n \"acc_norm\": 0.2620689655172414,\n \"acc_norm_stderr\": 0.036646663372252565\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2275132275132275,\n \"acc_stderr\": 0.021591269407823785,\n \"acc_norm\": 0.2275132275132275,\n \"acc_norm_stderr\": 0.021591269407823785\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n \"acc_stderr\": 0.03893259610604673,\n \"acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.03893259610604673\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.15,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.15,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.22580645161290322,\n \"acc_stderr\": 0.023785577884181012,\n \"acc_norm\": 0.22580645161290322,\n \"acc_norm_stderr\": 0.023785577884181012\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.23645320197044334,\n \"acc_stderr\": 0.02989611429173354,\n \"acc_norm\": 0.23645320197044334,\n \"acc_norm_stderr\": 0.02989611429173354\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.3090909090909091,\n \"acc_stderr\": 0.03608541011573967,\n \"acc_norm\": 0.3090909090909091,\n \"acc_norm_stderr\": 0.03608541011573967\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.1717171717171717,\n \"acc_stderr\": 0.026869716187429917,\n \"acc_norm\": 0.1717171717171717,\n \"acc_norm_stderr\": 0.026869716187429917\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.17616580310880828,\n \"acc_stderr\": 0.027493504244548047,\n 
\"acc_norm\": 0.17616580310880828,\n \"acc_norm_stderr\": 0.027493504244548047\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.25384615384615383,\n \"acc_stderr\": 0.022066054378726257,\n \"acc_norm\": 0.25384615384615383,\n \"acc_norm_stderr\": 0.022066054378726257\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2518518518518518,\n \"acc_stderr\": 0.02646611753895991,\n \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.02646611753895991\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.02755361446786381,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.02755361446786381\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2185430463576159,\n \"acc_stderr\": 0.033742355504256936,\n \"acc_norm\": 0.2185430463576159,\n \"acc_norm_stderr\": 0.033742355504256936\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1889908256880734,\n \"acc_stderr\": 0.01678548115920363,\n \"acc_norm\": 0.1889908256880734,\n \"acc_norm_stderr\": 0.01678548115920363\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.33796296296296297,\n \"acc_stderr\": 0.03225941352631295,\n \"acc_norm\": 0.33796296296296297,\n \"acc_norm_stderr\": 0.03225941352631295\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.03132179803083291,\n \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.03132179803083291\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2869198312236287,\n \"acc_stderr\": 0.02944377302259469,\n \"acc_norm\": 0.2869198312236287,\n \"acc_norm_stderr\": 0.02944377302259469\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.27802690582959644,\n \"acc_stderr\": 0.03006958487449405,\n \"acc_norm\": 0.27802690582959644,\n \"acc_norm_stderr\": 0.03006958487449405\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.03768335959728742,\n \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.03768335959728742\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.31901840490797545,\n \"acc_stderr\": 0.03661997551073836,\n \"acc_norm\": 0.31901840490797545,\n \"acc_norm_stderr\": 0.03661997551073836\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n \"acc_stderr\": 0.042466243366976256,\n \"acc_norm\": 0.2767857142857143,\n \"acc_norm_stderr\": 0.042466243366976256\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.1941747572815534,\n \"acc_stderr\": 0.039166677628225864,\n \"acc_norm\": 0.1941747572815534,\n \"acc_norm_stderr\": 0.039166677628225864\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.28205128205128205,\n \"acc_stderr\": 0.02948036054954119,\n \"acc_norm\": 0.28205128205128205,\n \"acc_norm_stderr\": 0.02948036054954119\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n 
\"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.25287356321839083,\n \"acc_stderr\": 0.01554337731371968,\n \"acc_norm\": 0.25287356321839083,\n \"acc_norm_stderr\": 0.01554337731371968\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2514450867052023,\n \"acc_stderr\": 0.02335736578587404,\n \"acc_norm\": 0.2514450867052023,\n \"acc_norm_stderr\": 0.02335736578587404\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25139664804469275,\n \"acc_stderr\": 0.014508979453553969,\n \"acc_norm\": 0.25139664804469275,\n \"acc_norm_stderr\": 0.014508979453553969\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.02355083135199509,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.02355083135199509\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.29260450160771706,\n \"acc_stderr\": 0.025839898334877983,\n \"acc_norm\": 0.29260450160771706,\n \"acc_norm_stderr\": 0.025839898334877983\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2623456790123457,\n \"acc_stderr\": 0.024477222856135104,\n \"acc_norm\": 0.2623456790123457,\n \"acc_norm_stderr\": 0.024477222856135104\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.20921985815602837,\n \"acc_stderr\": 0.024264769439988468,\n \"acc_norm\": 0.20921985815602837,\n \"acc_norm_stderr\": 0.024264769439988468\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24967405475880053,\n \"acc_stderr\": 0.011054538377832318,\n \"acc_norm\": 0.24967405475880053,\n \"acc_norm_stderr\": 0.011054538377832318\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.3492647058823529,\n \"acc_stderr\": 0.02895975519682486,\n \"acc_norm\": 0.3492647058823529,\n \"acc_norm_stderr\": 0.02895975519682486\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.01755581809132227,\n \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.01755581809132227\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n \"acc_stderr\": 0.04013964554072773,\n \"acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.04013964554072773\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.17551020408163265,\n \"acc_stderr\": 0.02435280072297001,\n \"acc_norm\": 0.17551020408163265,\n \"acc_norm_stderr\": 0.02435280072297001\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n \"acc_stderr\": 0.029929415408348387,\n \"acc_norm\": 0.23383084577114427,\n \"acc_norm_stderr\": 0.029929415408348387\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2891566265060241,\n \"acc_stderr\": 0.03529486801511115,\n \"acc_norm\": 0.2891566265060241,\n \"acc_norm_stderr\": 0.03529486801511115\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.03508771929824563,\n \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.03508771929824563\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23378212974296206,\n \"mc1_stderr\": 0.014816195991931586,\n \"mc2\": 0.38128711115047254,\n \"mc2_stderr\": 0.013974832670540031\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6393054459352802,\n \"acc_stderr\": 
0.013496064394234028\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.03790750568612585,\n \"acc_stderr\": 0.005260333907798431\n }\n}\n```", "repo_url": "https://huggingface.co/Josephgflowers/TinyLlama-3T-Cinder-v1.3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|arc:challenge|25_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|gsm8k|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hellaswag|10_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T23-56-02.747267.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T23-56-02.747267.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T23-56-02.747267.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T23-56-02.747267.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T23-56-02.747267.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["**/details_harness|winogrande|5_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-02T23-56-02.747267.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_02T23_56_02.747267", "path": ["results_2024-02-02T23-56-02.747267.parquet"]}, {"split": "latest", "path": ["results_2024-02-02T23-56-02.747267.parquet"]}]}]} | 2024-02-02T23:58:12+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Josephgflowers/TinyLlama-3T-Cinder-v1.3
Dataset automatically created during the evaluation run of model Josephgflowers/TinyLlama-3T-Cinder-v1.3 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
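A minimal sketch of that load call (the repo id is an assumption, inferred from the leaderboard's usual `details_<org>__<model>` naming; the config name appears in this card's file listing):

```python
from datasets import load_dataset

# Repo id assumed from the naming convention used across this card
# collection (the "/" in the model id is replaced by "__").
data = load_dataset("open-llm-leaderboard/details_Josephgflowers__TinyLlama-3T-Cinder-v1.3",
                    "harness_winogrande_5",
                    split="train")
```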
## Latest results
These are the latest results from run 2024-02-02T23:56:02.747267 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Josephgflowers/TinyLlama-3T-Cinder-v1.3\n\n\n\nDataset automatically created during the evaluation run of model Josephgflowers/TinyLlama-3T-Cinder-v1.3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T23:56:02.747267(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Josephgflowers/TinyLlama-3T-Cinder-v1.3\n\n\n\nDataset automatically created during the evaluation run of model Josephgflowers/TinyLlama-3T-Cinder-v1.3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T23:56:02.747267(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
c7100dd459f6ca0e5e38fc3f855a195e0ea64007 |

# Dataset Card for Gumzo
<!-- Provide a quick summary of the dataset. -->
A growing collection of 2 million African native-language training samples: 100k examples in each of 20 of the most popular African languages, including the following (a minimal loading sketch appears after the list):
- Swahili
- Kikuyu
- Yoruba
- Dholuo
- Zulu
- Kinyarwanda
- more to come...
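As a usage example, here is a minimal loading sketch with the `datasets` library; the repo id `bitsoko/AfroNative` is taken from this dataset's identifier, and the `train` split name is an assumption:

```python
from datasets import load_dataset

# Repo id taken from this dataset's identifier; the "train" split is an
# assumption -- adjust it to whatever splits the repository actually exposes.
gumzo = load_dataset("bitsoko/AfroNative", split="train")
print(gumzo[0])  # inspect a single training sample
```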
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** Bitsoko
- **Funded by:** Visit our website to support Gumzo
- **Shared by:** Allan@bitsoko
- **Language(s) (NLP):** Swahili, Kikuyu, Yoruba, Dholuo, Zulu, Kinyarwanda (more planned)
- **License:** Apache-2.0
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** https://git.bitsoko.org/gumzo
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** https://gumzo.bitsoko.org
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | bitsoko/AfroNative | [
"task_categories:text-generation",
"size_categories:100K<n<1M",
"language:sw",
"language:ki",
"language:yo",
"language:zu",
"language:rw",
"license:apache-2.0",
"region:us"
] | 2024-02-02T23:59:16+00:00 | {"language": ["sw", "ki", "yo", "zu", "rw"], "license": "apache-2.0", "size_categories": ["100K<n<1M"], "task_categories": ["text-generation"], "pretty_name": "a"} | 2024-02-15T16:42:45+00:00 | [] | [
"sw",
"ki",
"yo",
"zu",
"rw"
] | TAGS
#task_categories-text-generation #size_categories-100K<n<1M #language-Swahili (macrolanguage) #language-Kikuyu #language-Yoruba #language-Zulu #language-Kinyarwanda #license-apache-2.0 #region-us
|
!Gumzo Logo
# Dataset Card for Gumzo
A growing collection of 2 million African native-language training samples: 100k examples in each of 20 of the most popular African languages, including
- Swahili
- Kikuyu
- Yoruba
- Dholuo
- Zulu
- Kinyarwanda
- more to come...
## Dataset Details
### Dataset Description
- Curated by: Bitsoko
- Funded by: Visit our website to support Gumzo
- Shared by: Allan@bitsoko
- Language(s) (NLP):
- License: Apache-2.0
### Dataset Sources [optional]
- Repository: URL
- Paper [optional]:
- Demo [optional]: URL
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Gumzo\n\n\n\nA growing collection of 2 million african native language training samples. 100k examples in 20 of the most popular african languages including\n\n- Swahili\n- Kikuyu\n- Yoruba\n- Dholuo\n- Zulu\n- Kinyarwanda\n- more to come..",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: Bitsoko\n- Funded by: visit our website to support gumzo\n- Shared by: Allan@bitsoko\n- Language(s) (NLP): \n- License: Apache-2.0",
"### Dataset Sources [optional]\n\n\n\n- Repository: URL\n- Paper [optional]: \n- Demo [optional]: URL",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#task_categories-text-generation #size_categories-100K<n<1M #language-Swahili (macrolanguage) #language-Kikuyu #language-Yoruba #language-Zulu #language-Kinyarwanda #license-apache-2.0 #region-us \n",
"# Dataset Card for Gumzo\n\n\n\nA growing collection of 2 million african native language training samples. 100k examples in 20 of the most popular african languages including\n\n- Swahili\n- Kikuyu\n- Yoruba\n- Dholuo\n- Zulu\n- Kinyarwanda\n- more to come..",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: Bitsoko\n- Funded by: visit our website to support gumzo\n- Shared by: Allan@bitsoko\n- Language(s) (NLP): \n- License: Apache-2.0",
"### Dataset Sources [optional]\n\n\n\n- Repository: URL\n- Paper [optional]: \n- Demo [optional]: URL",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
7df7c5beeaeddf33cbc72ae2c4c5c079dad9f4ab |
# Dataset Card for Evaluation run of rizla/rizla-17
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [rizla/rizla-17](https://huggingface.co/rizla/rizla-17) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_rizla__rizla-17",
"harness_winogrande_5",
split="train")
```
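The aggregated metrics shown below live in the "results" configuration described above; a similar sketch (same repo id, using the "latest" split alias) loads the most recent aggregated run:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics of the run; the "latest" split
# always points to the most recent results (both are described above).
results = load_dataset("open-llm-leaderboard/details_rizla__rizla-17",
                       "results",
                       split="latest")
print(results[0])  # inspect the aggregated scores
```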
## Latest results
These are the [latest results from run 2024-02-02T23:58:31.137464](https://huggingface.co/datasets/open-llm-leaderboard/details_rizla__rizla-17/blob/main/results_2024-02-02T23-58-31.137464.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6499955281921569,
"acc_stderr": 0.0321248371661185,
"acc_norm": 0.6498192760869008,
"acc_norm_stderr": 0.032802028367607344,
"mc1": 0.6132190942472461,
"mc1_stderr": 0.017048857010515103,
"mc2": 0.7692598698604171,
"mc2_stderr": 0.013946986686441027
},
"harness|arc:challenge|25": {
"acc": 0.712457337883959,
"acc_stderr": 0.013226719056266129,
"acc_norm": 0.7363481228668942,
"acc_norm_stderr": 0.01287592915129705
},
"harness|hellaswag|10": {
"acc": 0.7381995618402709,
"acc_stderr": 0.0043871612030879446,
"acc_norm": 0.8972316271659032,
"acc_norm_stderr": 0.0030303552466563882
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.0470070803355104,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.0470070803355104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43386243386243384,
"acc_stderr": 0.025525034382474887,
"acc_norm": 0.43386243386243384,
"acc_norm_stderr": 0.025525034382474887
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121427,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121427
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948485,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.013387895731543604,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.013387895731543604
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.02433214677913413,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.02433214677913413
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4134078212290503,
"acc_stderr": 0.016469814928406167,
"acc_norm": 0.4134078212290503,
"acc_norm_stderr": 0.016469814928406167
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826517,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826517
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4726205997392438,
"acc_stderr": 0.012751075788015058,
"acc_norm": 0.4726205997392438,
"acc_norm_stderr": 0.012751075788015058
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6580882352941176,
"acc_stderr": 0.028814722422254184,
"acc_norm": 0.6580882352941176,
"acc_norm_stderr": 0.028814722422254184
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.018824219512706207,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.018824219512706207
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.0282638899437846,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.0282638899437846
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6132190942472461,
"mc1_stderr": 0.017048857010515103,
"mc2": 0.7692598698604171,
"mc2_stderr": 0.013946986686441027
},
"harness|winogrande|5": {
"acc": 0.8784530386740331,
"acc_stderr": 0.009183632046519964
},
"harness|gsm8k|5": {
"acc": 0.6148597422289613,
"acc_stderr": 0.013404165536474305
}
}
```
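As a small illustration of working with the per-task entries above, the sketch below averages the `acc` field over the MMLU (`hendrycksTest`) subtasks; the three values are copied from the output above and stand in for the full dict:

```python
# Minimal sketch: a subset of the per-task results shown above (values
# copied from the JSON); the full dict can be obtained with json.load
# on the results_*.json file linked above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.38},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6148148148148148},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6907894736842105},
}
mmlu_acc = [v["acc"] for k, v in results.items()
            if k.startswith("harness|hendrycksTest-")]
print(f"{len(mmlu_acc)} MMLU subtasks, mean acc = {sum(mmlu_acc)/len(mmlu_acc):.4f}")
```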
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_rizla__rizla-17 | [
"region:us"
] | 2024-02-03T00:00:50+00:00 | {"pretty_name": "Evaluation run of rizla/rizla-17", "dataset_summary": "Dataset automatically created during the evaluation run of model [rizla/rizla-17](https://huggingface.co/rizla/rizla-17) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_rizla__rizla-17\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-02T23:58:31.137464](https://huggingface.co/datasets/open-llm-leaderboard/details_rizla__rizla-17/blob/main/results_2024-02-02T23-58-31.137464.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6499955281921569,\n \"acc_stderr\": 0.0321248371661185,\n \"acc_norm\": 0.6498192760869008,\n \"acc_norm_stderr\": 0.032802028367607344,\n \"mc1\": 0.6132190942472461,\n \"mc1_stderr\": 0.017048857010515103,\n \"mc2\": 0.7692598698604171,\n \"mc2_stderr\": 0.013946986686441027\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.712457337883959,\n \"acc_stderr\": 0.013226719056266129,\n \"acc_norm\": 0.7363481228668942,\n \"acc_norm_stderr\": 0.01287592915129705\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7381995618402709,\n \"acc_stderr\": 0.0043871612030879446,\n \"acc_norm\": 0.8972316271659032,\n \"acc_norm_stderr\": 0.0030303552466563882\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n 
\"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.0470070803355104,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.0470070803355104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.43386243386243384,\n \"acc_stderr\": 0.025525034382474887,\n \"acc_norm\": 0.43386243386243384,\n \"acc_norm_stderr\": 0.025525034382474887\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121427,\n \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121427\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948485,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948485\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8314176245210728,\n \"acc_stderr\": 0.013387895731543604,\n \"acc_norm\": 
0.8314176245210728,\n \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.02433214677913413,\n \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.02433214677913413\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4134078212290503,\n \"acc_stderr\": 0.016469814928406167,\n \"acc_norm\": 0.4134078212290503,\n \"acc_norm_stderr\": 0.016469814928406167\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826517,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826517\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4726205997392438,\n \"acc_stderr\": 0.012751075788015058,\n \"acc_norm\": 0.4726205997392438,\n \"acc_norm_stderr\": 0.012751075788015058\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6580882352941176,\n \"acc_stderr\": 0.028814722422254184,\n \"acc_norm\": 0.6580882352941176,\n \"acc_norm_stderr\": 0.028814722422254184\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.018824219512706207,\n \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.018824219512706207\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.0282638899437846,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.0282638899437846\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6132190942472461,\n \"mc1_stderr\": 0.017048857010515103,\n \"mc2\": 0.7692598698604171,\n \"mc2_stderr\": 0.013946986686441027\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8784530386740331,\n \"acc_stderr\": 0.009183632046519964\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6148597422289613,\n \"acc_stderr\": 0.013404165536474305\n }\n}\n```", "repo_url": "https://huggingface.co/rizla/rizla-17", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|arc:challenge|25_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|gsm8k|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hellaswag|10_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T23-58-31.137464.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T23-58-31.137464.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-02T23-58-31.137464.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-02T23-58-31.137464.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T23-58-31.137464.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T23-58-31.137464.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["**/details_harness|winogrande|5_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-02T23-58-31.137464.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_02T23_58_31.137464", "path": ["results_2024-02-02T23-58-31.137464.parquet"]}, {"split": "latest", "path": 
["results_2024-02-02T23-58-31.137464.parquet"]}]}]} | 2024-02-03T00:01:21+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of rizla/rizla-17
Dataset automatically created during the evaluation run of model rizla/rizla-17 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
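A minimal sketch, assuming the details repository follows the leaderboard's usual `details_<org>__<model>` naming convention (so `open-llm-leaderboard/details_rizla__rizla-17` here); any configuration listed in this repo, e.g. `harness_winogrande_5`, can be requested:

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_rizla__rizla-17",
	"harness_winogrande_5",
	split="train")
```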
## Latest results
These are the latest results from run 2024-02-02T23:58:31.137464 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of rizla/rizla-17\n\n\n\nDataset automatically created during the evaluation run of model rizla/rizla-17 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T23:58:31.137464(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of rizla/rizla-17\n\n\n\nDataset automatically created during the evaluation run of model rizla/rizla-17 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-02T23:58:31.137464(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
164a57d84bb8cca0692a48b2801f95ba3a5360c3 | # AlGhafa Arabic LLM Benchmark
### New fix: Normalized whitespace characters and ensured consistency across all datasets for improved data quality and compatibility.
Multiple-choice evaluation benchmark for zero- and few-shot evaluation of Arabic LLMs; we adapt the following tasks:
- Belebele Ar MSA [Bandarkar et al. (2023)](https://arxiv.org/abs/2308.16884): 900 entries
- Belebele Ar Dialects [Bandarkar et al. (2023)](https://arxiv.org/abs/2308.16884): 5400 entries
- COPA Ar: 89 entries machine-translated from English [COPA](https://people.ict.usc.edu/~gordon/copa.html) and verified by native Arabic speakers.
- Facts balanced (based on AraFacts) [Sheikh Ali et al. (2021)](https://aclanthology.org/2021.wanlp-1.26): 80 entries (after balancing the dataset), each consisting of a short article and a corresponding claim, to be deemed true or false
- MCQ Exams Ar [Hardalov et al. (2020)](https://aclanthology.org/2020.emnlp-main.438): 2248 entries
- OpenbookQA Ar: 336 entries, machine-translated from English [OpenbookQA](https://api.semanticscholar.org/CorpusID:52183757) and verified by native Arabic speakers.
- Rating sentiment (HARD-Arabic-Dataset) [Elnagar et al. (2018)](https://link.springer.com/chapter/10.1007/978-3-319-67056-0_3): determine the sentiment
of reviews, with review scores (1-5) mapped to 3 categories (positive, neutral, negative) as follows: 1-2 negative, 3 neutral, 4-5 positive; 6000 entries (2000 for each of the three classes)
- Rating sentiment no neutral (HARD-Arabic-Dataset) [Elnagar et al., 2018](https://link.springer.com/chapter/10.1007/978-3-319-67056-0_3): 8000 entries (4000 for each class) in which we remove the neutral class by extending the negative class (corresponding to scores 1-3)
- Sentiment [Abu Farha et al., 2021](https://aclanthology.org/2021.wanlp-1.36): 1725 entries based on Twitter posts that can be classified as positive, negative, or neutral
- SOQAL [Mozannar et al., 2019](https://aclanthology.org/W19-4612): grounded statement task to assess in-context reading comprehension, consisting of a context and a related question; consists of 155 entries with one original correct answer, transformed to a multiple-choice task by adding four
human-curated incorrect choices per sample
- XGLUE (based on XGLUE-MLQA) [Liang et al., 2020](https://arxiv.org/abs/2004.01401); [Lewis et al., 2019](https://arxiv.org/abs/1910.07475): consists of
155 entries transformed to a multiple-choice task by adding four human-curated incorrect choices per sample
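Each task above ships as its own configuration of this repository with a single `train` split (the config and field names below are taken from this repo's metadata, e.g. `meta_ar_msa`, where `sol1`–`sol4` hold the candidate answers). A minimal loading sketch:

```python
from datasets import load_dataset

# Load one benchmark task; every config exposes a single "train" split.
msa = load_dataset("OALL/AlGhafa-Arabic-LLM-Benchmark", "meta_ar_msa", split="train")

sample = msa[0]
print(sample["query"])                      # the question / context
print(sample["sol1"], sample["sol2"],
      sample["sol3"], sample["sol4"])       # candidate answers
print("gold:", sample["label"])             # the correct answer label
```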
## Citing the AlGhafa benchmark:
```bibtex
@inproceedings{almazrouei-etal-2023-alghafa,
title = "{A}l{G}hafa Evaluation Benchmark for {A}rabic Language Models",
author = "Almazrouei, Ebtesam and
Cojocaru, Ruxandra and
Baldo, Michele and
Malartic, Quentin and
Alobeidli, Hamza and
Mazzotta, Daniele and
Penedo, Guilherme and
Campesan, Giulia and
Farooq, Mugariya and
Alhammadi, Maitha and
Launay, Julien and
Noune, Badreddine",
editor = "Sawaf, Hassan and
El-Beltagy, Samhaa and
Zaghouani, Wajdi and
Magdy, Walid and
Abdelali, Ahmed and
Tomeh, Nadi and
Abu Farha, Ibrahim and
Habash, Nizar and
Khalifa, Salam and
Keleg, Amr and
Haddad, Hatem and
Zitouni, Imed and
Mrini, Khalil and
Almatham, Rawan",
booktitle = "Proceedings of ArabicNLP 2023",
month = dec,
year = "2023",
address = "Singapore (Hybrid)",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.arabicnlp-1.21",
doi = "10.18653/v1/2023.arabicnlp-1.21",
pages = "244--275",
abstract = "Recent advances in the space of Arabic large language models have opened up a wealth of potential practical applications. From optimal training strategies, large scale data acquisition and continuously increasing NLP resources, the Arabic LLM landscape has improved in a very short span of time, despite being plagued by training data scarcity and limited evaluation resources compared to English. In line with contributing towards this ever-growing field, we introduce AlGhafa, a new multiple-choice evaluation benchmark for Arabic LLMs. For showcasing purposes, we train a new suite of models, including a 14 billion parameter model, the largest monolingual Arabic decoder-only model to date. We use a collection of publicly available datasets, as well as a newly introduced HandMade dataset consisting of 8 billion tokens. Finally, we explore the quantitative and qualitative toxicity of several Arabic models, comparing our models to existing public Arabic LLMs.",
}
``` | OALL/AlGhafa-Arabic-LLM-Benchmark | [
"arxiv:2308.16884",
"arxiv:2004.01401",
"arxiv:1910.07475",
"region:us"
] | 2024-02-03T00:02:52+00:00 | {"dataset_info": [{"config_name": "mcq_exams_test_ar", "features": [{"name": "query", "dtype": "string"}, {"name": "sol1", "dtype": "string"}, {"name": "sol2", "dtype": "string"}, {"name": "sol3", "dtype": "string"}, {"name": "sol4", "dtype": "string"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 153138, "num_examples": 562}], "download_size": 89335, "dataset_size": 153138}, {"config_name": "meta_ar_dialects", "features": [{"name": "query", "dtype": "string"}, {"name": "sol1", "dtype": "string"}, {"name": "sol2", "dtype": "string"}, {"name": "sol3", "dtype": "string"}, {"name": "sol4", "dtype": "string"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5617778, "num_examples": 5400}], "download_size": 2165364, "dataset_size": 5617778}, {"config_name": "meta_ar_msa", "features": [{"name": "query", "dtype": "string"}, {"name": "sol1", "dtype": "string"}, {"name": "sol2", "dtype": "string"}, {"name": "sol3", "dtype": "string"}, {"name": "sol4", "dtype": "string"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 954246, "num_examples": 900}], "download_size": 370581, "dataset_size": 954246}, {"config_name": "multiple_choice_copa_translated_task", "features": [{"name": "query", "dtype": "string"}, {"name": "sol1", "dtype": "string"}, {"name": "sol2", "dtype": "string"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 12752, "num_examples": 89}], "download_size": 9963, "dataset_size": 12752}, {"config_name": "multiple_choice_facts_truefalse_balanced_task", "features": [{"name": "query", "dtype": "string"}, {"name": "sol1", "dtype": "string"}, {"name": "sol2", "dtype": "string"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 129140, "num_examples": 80}], "download_size": 67202, "dataset_size": 129140}, {"config_name": "multiple_choice_grounded_statement_soqal_task", "features": [{"name": "query", "dtype": "string"}, {"name": "sol1", "dtype": "string"}, {"name": "sol2", "dtype": "string"}, {"name": "sol3", "dtype": "string"}, {"name": "sol4", "dtype": "string"}, {"name": "sol5", "dtype": "string"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 161956, "num_examples": 155}], "download_size": 59090, "dataset_size": 161956}, {"config_name": "multiple_choice_grounded_statement_xglue_mlqa_task", "features": [{"name": "query", "dtype": "string"}, {"name": "sol1", "dtype": "string"}, {"name": "sol2", "dtype": "string"}, {"name": "sol3", "dtype": "string"}, {"name": "sol4", "dtype": "string"}, {"name": "sol5", "dtype": "string"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 146071, "num_examples": 155}], "download_size": 77150, "dataset_size": 146071}, {"config_name": "multiple_choice_openbookqa_translated_task", "features": [{"name": "query", "dtype": "string"}, {"name": "sol1", "dtype": "string"}, {"name": "sol2", "dtype": "string"}, {"name": "sol3", "dtype": "string"}, {"name": "sol4", "dtype": "string"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 71543, "num_examples": 336}], "download_size": 44973, "dataset_size": 71543}, {"config_name": "multiple_choice_rating_sentiment_no_neutral_task", "features": [{"name": "query", "dtype": "string"}, {"name": "sol1", "dtype": "string"}, {"name": "sol2", "dtype": "string"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 
1408389, "num_examples": 8000}], "download_size": 481296, "dataset_size": 1408389}, {"config_name": "multiple_choice_rating_sentiment_task", "features": [{"name": "query", "dtype": "string"}, {"name": "sol1", "dtype": "string"}, {"name": "sol2", "dtype": "string"}, {"name": "sol3", "dtype": "string"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1219534, "num_examples": 6000}], "download_size": 375276, "dataset_size": 1219534}, {"config_name": "multiple_choice_sentiment_task", "features": [{"name": "query", "dtype": "string"}, {"name": "sol1", "dtype": "string"}, {"name": "sol2", "dtype": "string"}, {"name": "sol3", "dtype": "string"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 457756, "num_examples": 1725}], "download_size": 185976, "dataset_size": 457756}], "configs": [{"config_name": "mcq_exams_test_ar", "data_files": [{"split": "train", "path": "mcq_exams_test_ar/train-*"}]}, {"config_name": "meta_ar_dialects", "data_files": [{"split": "train", "path": "meta_ar_dialects/train-*"}]}, {"config_name": "meta_ar_msa", "data_files": [{"split": "train", "path": "meta_ar_msa/train-*"}]}, {"config_name": "multiple_choice_copa_translated_task", "data_files": [{"split": "train", "path": "multiple_choice_copa_translated_task/train-*"}]}, {"config_name": "multiple_choice_facts_truefalse_balanced_task", "data_files": [{"split": "train", "path": "multiple_choice_facts_truefalse_balanced_task/train-*"}]}, {"config_name": "multiple_choice_grounded_statement_soqal_task", "data_files": [{"split": "train", "path": "multiple_choice_grounded_statement_soqal_task/train-*"}]}, {"config_name": "multiple_choice_grounded_statement_xglue_mlqa_task", "data_files": [{"split": "train", "path": "multiple_choice_grounded_statement_xglue_mlqa_task/train-*"}]}, {"config_name": "multiple_choice_openbookqa_translated_task", "data_files": [{"split": "train", "path": "multiple_choice_openbookqa_translated_task/train-*"}]}, {"config_name": "multiple_choice_rating_sentiment_no_neutral_task", "data_files": [{"split": "train", "path": "multiple_choice_rating_sentiment_no_neutral_task/train-*"}]}, {"config_name": "multiple_choice_rating_sentiment_task", "data_files": [{"split": "train", "path": "multiple_choice_rating_sentiment_task/train-*"}]}, {"config_name": "multiple_choice_sentiment_task", "data_files": [{"split": "train", "path": "multiple_choice_sentiment_task/train-*"}]}]} | 2024-02-03T00:12:18+00:00 | [
"2308.16884",
"2004.01401",
"1910.07475"
] | [] | TAGS
#arxiv-2308.16884 #arxiv-2004.01401 #arxiv-1910.07475 #region-us
| # AlGhafa Arabic LLM Benchmark
### New fix: Normalized whitespace characters and ensured consistency across all datasets for improved data quality and compatibility.
Multiple-choice evaluation benchmark for zero- and few-shot evaluation of Arabic LLMs; we adapt the following tasks:
- Belebele Ar MSA Bandarkar et al. (2023): 900 entries
- Belebele Ar Dialects Bandarkar et al. (2023): 5400 entries
- COPA Ar: 89 entries machine-translated from English COPA and verified by native Arabic speakers.
- Facts balanced (based on AraFacts) Sheikh Ali et al. (2021): 80 entries (after balancing the dataset), each consisting of a short article and a corresponding claim, to be deemed true or false
- MCQ Exams Ar Hardalov et al. (2020): 2248 entries
- OpenbookQA Ar: 336 entries, machine-translated from English OpenbookQA and verified by native Arabic speakers.
- Rating sentiment (HARD-Arabic-Dataset) Elnagar et al. (2018): determine the sentiment
of reviews, with review scores (1-5) mapped to 3 categories (positive, neutral, negative) as follows: 1-2 negative, 3 neutral, 4-5 positive; 6000 entries (2000 for each of the three classes)
- Rating sentiment no neutral (HARD-Arabic-Dataset) Elnagar et al., 2018: 8000 entries (4000 for each class) in which we remove the neutral class by extending the negative class (corresponding to scores 1-3)
- Sentiment Abu Farha et al., 2021: 1725 entries based on Twitter posts that can be classified as positive, negative, or neutral
- SOQAL Mozannar et al., 2019: grounded statement task to assess in-context reading comprehension, consisting of a context and a related question; consists of 155 entries with one original correct answer, transformed to a multiple-choice task by adding four
human-curated incorrect choices per sample
- XGLUE (based on XGLUE-MLQA) Liang et al., 2020; Lewis et al., 2019: consists of
155 entries transformed to a multiple-choice task by adding four human-curated incorrect choices per sample
## Citing the AlGhafa benchmark:
| [
"# AlGhafa Arabic LLM Benchmark",
"### New fix: Normalized whitespace characters and ensured consistency across all datasets for improved data quality and compatibility.\n\nMultiple-choice evaluation benchmark for zero- and few-shot evaluation of Arabic LLMs, we adapt the following tasks:\n\n- Belebele Ar MSA Bandarkar et al. (2023): 900 entries\n- Belebele Ar Dialects Bandarkar et al. (2023): 5400 entries\n- COPA Ar: 89 entries machine-translated from English COPA and verified by native Arabic speakers.\n- Facts balanced (based on AraFacts) Sheikh Ali et al. (2021): 80 entries (after balancing dataset), consisting of a short article and a corresponding claim, to be deemed true or false\n- MCQ Exams Ar Hardalov et al. (2020): 2248 entries\n- OpenbookQA Ar: 336 entries. Machine-translated from English OpenbookQA and verified native Arabic speakers.\n- Rating sentiment (HARD-Arabic-Dataset) Elnagar et al. (2018): determine the sentiment\n of reviews, with 3 possible categories (positive, neutral, negative) transformed to a review score (1-5) as follows: 1-2 negative, 3 neutral, 4-5 positive; 6000 entries (2000 for each of the three classes)\n- Rating sentiment no neutral (HARD-Arabic-Dataset) Elnagar et al., 2018: 8000 entries in which we remove the neutral class by extending the positive class (corresponding to scores 1-3); 8000 entries (4000 for each class)\n- Sentiment Abu Farha et al., 2021: 1725 entries based on Twitter posts, that can be classified as positive, negative, or neutral\n- SOQAL Mozannar et al., 2019: grounded statement task to assess in-context reading comprehension, consisting of a context and a related question; consists of 155 entries with one original correct answer, transformed to multiple choice task by adding four possible\n human-curated incorrect choices per sample\n- XGLUE (based on XGLUE-MLQA) Liang et al., 2020; Lewis et al., 2019: consists of\n 155 entries transformed to a multiple choice task by adding four human-curated incorrect choices per sample",
"## Citing the AlGhafa benchmark:"
] | [
"TAGS\n#arxiv-2308.16884 #arxiv-2004.01401 #arxiv-1910.07475 #region-us \n",
"# AlGhafa Arabic LLM Benchmark",
"### New fix: Normalized whitespace characters and ensured consistency across all datasets for improved data quality and compatibility.\n\nMultiple-choice evaluation benchmark for zero- and few-shot evaluation of Arabic LLMs, we adapt the following tasks:\n\n- Belebele Ar MSA Bandarkar et al. (2023): 900 entries\n- Belebele Ar Dialects Bandarkar et al. (2023): 5400 entries\n- COPA Ar: 89 entries machine-translated from English COPA and verified by native Arabic speakers.\n- Facts balanced (based on AraFacts) Sheikh Ali et al. (2021): 80 entries (after balancing dataset), consisting of a short article and a corresponding claim, to be deemed true or false\n- MCQ Exams Ar Hardalov et al. (2020): 2248 entries\n- OpenbookQA Ar: 336 entries. Machine-translated from English OpenbookQA and verified native Arabic speakers.\n- Rating sentiment (HARD-Arabic-Dataset) Elnagar et al. (2018): determine the sentiment\n of reviews, with 3 possible categories (positive, neutral, negative) transformed to a review score (1-5) as follows: 1-2 negative, 3 neutral, 4-5 positive; 6000 entries (2000 for each of the three classes)\n- Rating sentiment no neutral (HARD-Arabic-Dataset) Elnagar et al., 2018: 8000 entries in which we remove the neutral class by extending the positive class (corresponding to scores 1-3); 8000 entries (4000 for each class)\n- Sentiment Abu Farha et al., 2021: 1725 entries based on Twitter posts, that can be classified as positive, negative, or neutral\n- SOQAL Mozannar et al., 2019: grounded statement task to assess in-context reading comprehension, consisting of a context and a related question; consists of 155 entries with one original correct answer, transformed to multiple choice task by adding four possible\n human-curated incorrect choices per sample\n- XGLUE (based on XGLUE-MLQA) Liang et al., 2020; Lewis et al., 2019: consists of\n 155 entries transformed to a multiple choice task by adding four human-curated incorrect choices per sample",
"## Citing the AlGhafa benchmark:"
] |
4a44efcf0faa734aee6d6f00aeab1ebd7606cf2c | Dataset using the bert-cased tokenizer; sentences are cut off at 128 tokens (applied to single sentences, not sentence pairs), with all sentence pairs extracted.
Original datasets:
https://huggingface.co/datasets/bookcorpus
https://huggingface.co/datasets/wikipedia Variant: 20220301.en
Mapped from: https://huggingface.co/datasets/gmongaras/BERT_Base_Cased_128_Dataset | gmongaras/BERT_Base_Cased_128_Dataset_Mapped | [
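As a rough usage sketch (the feature names come from this repo's metadata; streaming is only a suggestion given the ~16 GB download):

```python
from datasets import load_dataset

# Stream the pre-tokenized corpus rather than downloading ~16 GB up front.
ds = load_dataset("gmongaras/BERT_Base_Cased_128_Dataset_Mapped",
                  split="train", streaming=True)

row = next(iter(ds))
# Rows are already BERT-ready: input_ids (int32), token_type_ids and
# attention_mask (int8); per the card, sequences are cut off at 128 tokens.
print(len(row["input_ids"]), row["input_ids"][:10])
```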
"region:us"
] | 2024-02-03T00:14:41+00:00 | {"dataset_info": {"features": [{"name": "input_ids", "sequence": "int32"}, {"name": "token_type_ids", "sequence": "int8"}, {"name": "attention_mask", "sequence": "int8"}], "splits": [{"name": "train", "num_bytes": 51067549265.998314, "num_examples": 131569119}], "download_size": 15915934708, "dataset_size": 51067549265.998314}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-03T20:04:54+00:00 | [] | [] | TAGS
#region-us
| Dataset using the bert-cased tokenizer; sentences are cut off at 128 tokens (applied to single sentences, not sentence pairs), with all sentence pairs extracted.
Original datasets:
URL
URL Variant: URL
Mapped from: URL | [] | [
"TAGS\n#region-us \n"
] |
ae933b52c1dbe1acccda4f724a6d277a1de3a837 | ### Dataset Card for Hercules-v2.0

#### Overview
**Dataset Name:** Hercules-v2.0
**Version:** 2.0
**Date of Release:** February 2, 2024
**Size:** 1,307,174 examples
**Data Sources:**
Hercules-v2.0 is an enriched instruction dataset derived from OpenHermes-2.5, aimed at enhancing its diversity and scope. The dataset amalgamates contributions from various data sources, with a strong emphasis on Biology, Physics, Medicine, Math, Computer Science, Instruction Following, Function Calling, and Roleplay. The data sources used to construct Hercules-v2.0 include:
- cognitivecomputations/dolphin (first 200k examples)
- Evol Instruct 70K && 140K
- teknium/GPT4-LLM-Cleaned
- jondurbin/airoboros-3.2
- AlekseyKorshuk/camel-chatml
- CollectiveCognition/chats-data-2023-09-22
- Nebulous/lmsys-chat-1m-smortmodelsonly
- glaiveai/glaive-code-assistant-v2
- glaiveai/glaive-code-assistant
- glaiveai/glaive-function-calling-v2
- garage-bAInd/Open-Platypus
- meta-math/MetaMathQA (first 40k examples)
- teknium/GPTeacher-General-Instruct
- GPTeacher roleplay datasets
- BI55/MedText
- pubmed_qa labeled subset
- Unnatural Instructions
- CollectiveCognition/chats-data-2023-09-27
- CollectiveCognition/chats-data-2023-10-16
This dataset was written mostly with GPT-4, but responses from other models, such as Claude-1, Claude-1-instant, Claude-2, Claude-2.1, and GPT-3.5-Turbo, can also be found in the data.
Curation of this dataset was based on findings from hercules-v1.0.
Warning: This dataset contains toxic examples. Use at your own risk.
#### Description
Hercules-v2.0 is designed to serve as a comprehensive and multifaceted dataset tailored for the development and evaluation of advanced machine learning models, particularly those focused on natural language understanding and processing in specialized domains. It includes a variety of formats, such as question-answering pairs, dialogues, function calls, and roleplay scenarios, providing robust training material for models to handle complex instructions and execute function calls.
#### Data Format
The dataset includes JSON-formatted entries, with a unique structure to incorporate function calling examples. Each entry is composed of a sequence of interactions, each tagged with "from" to indicate the speaker (human, function-call, function-response, or gpt) and "value" to present the content or payload of the interaction. For example:
```json
[
{ "from": "human", "value": "Hi, I need to convert a temperature from Celsius to Fahrenheit. The temperature is 30 degrees Celsius." },
{ "from": "function-call", "value": "{\"name\": \"convert_temperature\", \"arguments\": '{\"temperature\": 30, \"from_unit\": \"Celsius\", \"to_unit\": \"Fahrenheit\"}'}" },
{ "from": "function-response", "value": "{\"converted_temperature\": 86}" },
{ "from": "gpt", "value": "The converted temperature from 30 degrees Celsius to Fahrenheit is 86 degrees Fahrenheit." }
]
```
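Note that the `arguments` payload in the "function-call" turn above is wrapped in single quotes, so the outer value is a Python-style literal rather than strict JSON. A minimal parsing sketch (hypothetical, not part of the dataset tooling) that recovers the structured call:

```python
import ast
import json

# Hypothetical sketch: pull the structured call out of a "function-call" turn.
# ast.literal_eval handles the mixed single/double quotes in the outer payload;
# the inner arguments string is strict JSON.
turn = {
    "from": "function-call",
    "value": (
        '{"name": "convert_temperature", '
        '"arguments": \'{"temperature": 30, "from_unit": "Celsius", '
        '"to_unit": "Fahrenheit"}\'}'
    ),
}

call = ast.literal_eval(turn["value"])     # {'name': ..., 'arguments': '<json string>'}
arguments = json.loads(call["arguments"])  # e.g. {'temperature': 30, ...}
print(call["name"], arguments)
```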
#### Usage
The Hercules-v2.0 dataset is designed for training and evaluating AI systems in their ability to follow instructions, execute function calls, and interact in roleplay scenarios across various scientific and technical disciplines. Researchers and developers can leverage this dataset for the following uses; a minimal loading sketch appears after the list:
- Enhancing language models' understanding of complex topics.
- Improving the accuracy of function-call executions within conversational agents.
- Developing models capable of engaging in educational and informative dialogue.
- Benchmarking systems on their ability to follow intricate instructions and provide accurate responses.
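A minimal loading sketch (assuming the Hugging Face `datasets` library and the repository id given in the citation below); streaming avoids materializing the full download before inspecting samples:

```python
from datasets import load_dataset

# Sketch: stream the train split and print one conversation.
dataset = load_dataset("Locutusque/hercules-v2.0", split="train", streaming=True)

example = next(iter(dataset))
print(example["source"])
for turn in example["conversations"]:
    print(f'{turn["from"]}: {turn["value"][:80]}')
```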
#### Licensing
This dataset is released under the apache-2.0 license.
#### Citation
Researchers using Hercules-v2.0 in their work should cite the dataset as follows:
```
@misc{sebastian_gabarain_2024,
title = {Hercules-v2.0: An Instruction Dataset for Specialized Domains},
author = {Sebastian Gabarain},
publisher = {HuggingFace},
year = {2024},
  doi = {10.57967/hf/1744},
url = {https://huggingface.co/datasets/Locutusque/hercules-v2.0}
}
```
#### Acknowledgements
Hercules-v2.0 was made possible thanks to the contributions from various datasets and the community's efforts in compiling and refining data to create a rich and diverse instruction set. Special thanks go to the creator of OpenHermes-2.5 and all the data sources listed above.
#### Version History
- v2.0: Current version with enhanced diversity and scope.
- v1.0: Initial release.
"size_categories:1M<n<10M",
"language:en",
"license:apache-2.0",
"code",
"function calling",
"chemistry",
"biology",
"physics",
"math",
"medical",
"not-for-all-audiences",
"synthetic",
"doi:10.57967/hf/1744",
"region:us"
] | 2024-02-03T00:33:11+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["1M<n<10M"], "dataset_info": {"features": [{"name": "conversations", "list": [{"name": "from", "dtype": "string"}, {"name": "value", "dtype": "string"}]}, {"name": "source", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 2323224670, "num_examples": 1307174}], "download_size": 1177302986, "dataset_size": 2323224670}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "tags": ["code", "function calling", "chemistry", "biology", "physics", "math", "medical", "not-for-all-audiences", "synthetic"]} | 2024-02-07T15:01:23+00:00 | [] | [
"en"
] | TAGS
#size_categories-1M<n<10M #language-English #license-apache-2.0 #code #function calling #chemistry #biology #physics #math #medical #not-for-all-audiences #synthetic #doi-10.57967/hf/1744 #region-us
| ### Dataset Card for Hercules-v2.0
!image/png
#### Overview
Dataset Name: Hercules-v2.0
Version: 2.0
Date of Release: February 2, 2024
Size: 1,307,174
Data Sources:
Hercules-v2.0 is an enriched instruction dataset derived from OpenHermes-2.5, aimed at enhancing its diversity and scope. The dataset amalgamates contributions from various data sources, with a strong emphasis on Biology, Physics, Medicine, Math, Computer Science, Instruction Following, Function Calling, and Roleplay. The data sources used to construct Hercules-v2.0 include:
- cognitivecomputations/dolphin (first 200k examples)
- Evol Instruct 70K && 140K
- teknium/GPT4-LLM-Cleaned
- jondurbin/airoboros-3.2
- AlekseyKorshuk/camel-chatml
- CollectiveCognition/chats-data-2023-09-22
- Nebulous/lmsys-chat-1m-smortmodelsonly
- glaiveai/glaive-code-assistant-v2
- glaiveai/glaive-code-assistant
- glaiveai/glaive-function-calling-v2
- garage-bAInd/Open-Platypus
- meta-math/MetaMathQA (first 40k examples)
- teknium/GPTeacher-General-Instruct
- GPTeacher roleplay datasets
- BI55/MedText
- pubmed_qa labeled subset
- Unnatural Instructions
- CollectiveCognition/chats-data-2023-09-27
- CollectiveCognition/chats-data-2023-10-16
This dataset was written mostly with GPT-4, but responses from other models, such as Claude-1, Claude-1-instant, Claude-2, Claude-2.1, and GPT-3.5-Turbo, can also be found in the data.
Curation of this dataset was based on findings from hercules-v1.0.
Warning: This dataset contains toxic examples. Use at your own risk.
#### Description
Hercules-v2.0 is designed to serve as a comprehensive and multifaceted dataset tailored for the development and evaluation of advanced machine learning models, particularly those focused on natural language understanding and processing in specialized domains. It includes a variety of formats, such as question-answering pairs, dialogues, function calls, and roleplay scenarios, providing robust training material for models to handle complex instructions and execute function calls.
#### Data Format
The dataset includes JSON-formatted entries, with a unique structure to incorporate function calling examples. Each entry is composed of a sequence of interactions, each tagged with "from" to indicate the speaker (human, function-call, function-response, or gpt) and "value" to present the content or payload of the interaction. For example:
#### Usage
The Hercules-v2.0 dataset is designed for training and evaluating AI systems in their ability to follow instructions, execute function calls, and interact in roleplay scenarios across various scientific and technical disciplines. Researchers and developers can leverage this dataset for:
- Enhancing language models' understanding of complex topics.
- Improving the accuracy of function-call executions within conversational agents.
- Developing models capable of engaging in educational and informative dialogue.
- Benchmarking systems on their ability to follow intricate instructions and provide accurate responses.
#### Licensing
This dataset is released under the apache-2.0 license.
Researchers using Hercules-v2.0 in their work should cite the dataset as follows:
#### Acknowledgements
Hercules-v2.0 was made possible thanks to the contributions from various datasets and the community's efforts in compiling and refining data to create a rich and diverse instruction set. Special thanks go to the creator of OpenHermes-2.5 and all the data sources listed above.
#### Version History
v2.0: Current version with enhanced diversity and scope.
v1.0: Initial release. | [
"### Dataset Card for Hercules-v2.0\n\n!image/png",
"#### Overview\nDataset Name: Hercules-v2.0\n\nVersion: 2.0\n\nDate of Release: February 2, 2024\n\nSize: 1,307,174\n\nData Sources: \nHercules-v2.0 is an enriched instruction dataset derived from OpenHermes-2.5, aimed at enhancing its diversity and scope. The dataset amalgamates contributions from various data sources, with a strong emphasis on Biology, Physics, Medicine, Math, Computer Science, Instruction Following, Function Calling, and Roleplay. The data sources used to construct Hercules-v2.0 include:\n- cognitivecomputations/dolphin (first 200k examples)\n- Evol Instruct 70K && 140K\n- teknium/GPT4-LLM-Cleaned\n- jondurbin/airoboros-3.2\n- AlekseyKorshuk/camel-chatml\n- CollectiveCognition/chats-data-2023-09-22\n- Nebulous/lmsys-chat-1m-smortmodelsonly\n- glaiveai/glaive-code-assistant-v2\n- glaiveai/glaive-code-assistant\n- glaiveai/glaive-function-calling-v2\n- garage-bAInd/Open-Platypus\n- meta-math/MetaMathQA (first 40k examples)\n- teknium/GPTeacher-General-Instruct\n- GPTeacher roleplay datasets\n- BI55/MedText\n- pubmed_qa labeled subset\n- Unnatural Instructions\n- CollectiveCognition/chats-data-2023-09-27\n- CollectiveCognition/chats-data-2023-10-16\n\nThis dataset is written with mostly GPT-4, but other models such as Claude-1, Claude-1-instant, Claude-2, Claude-2.1, and GPT-3.5-Turbo can be found in the data.\n\nCuration of this dataset was based on findings from hercules-v1.0.\n\nWarning: This dataset contains toxic examples. Use at your own risk.",
"#### Description\nHercules-v2.0 is designed to serve as a comprehensive and multifaceted dataset tailored for the development and evaluation of advanced machine learning models, particularly those focused on natural language understanding and processing in specialized domains. It includes a variety of formats, such as question-answering pairs, dialogues, function calls, and roleplay scenarios, providing robust training material for models to handle complex instructions and execute function calls.",
"#### Data Format\nThe dataset includes JSON-formatted entries, with a unique structure to incorporate function calling examples. Each entry is composed of a sequence of interactions, each tagged with \"from\" to indicate the speaker (human, function-call, function-response, or gpt) and \"value\" to present the content or payload of the interaction. For example:",
"#### Usage\n\nThe Hercules-v2.0 dataset is designed for training and evaluating AI systems in their ability to follow instructions, execute function calls, and interact in roleplay scenarios across various scientific and technical disciplines. Researchers and developers can leverage this dataset for:\n\n- Enhancing language models' understanding of complex topics.\n- Improving the accuracy of function-call executions within conversational agents.\n- Developing models capable of engaging in educational and informative dialogue.\n- Benchmarking systems on their ability to follow intricate instructions and provide accurate responses.",
"#### Licensing\n\nThis dataset is released under the apache-2.0 license.\nResearchers using Hercules-v2.0 in their work should cite the dataset as follows:",
"#### Acknowledgements\n\nHercules-v2.0 was made possible thanks to the contributions from various datasets and the community's efforts in compiling and refining data to create a rich and diverse instruction set. Special thanks go to the creator of OpenHermes-2.5 and all the data sources listed above.",
"#### Version History\n\n v2.0: Current version with enhanced diversity and scope.\n v1.0: Initial release."
] | [
"TAGS\n#size_categories-1M<n<10M #language-English #license-apache-2.0 #code #function calling #chemistry #biology #physics #math #medical #not-for-all-audiences #synthetic #doi-10.57967/hf/1744 #region-us \n",
"### Dataset Card for Hercules-v2.0\n\n!image/png",
"#### Overview\nDataset Name: Hercules-v2.0\n\nVersion: 2.0\n\nDate of Release: February 2, 2024\n\nSize: 1,307,174\n\nData Sources: \nHercules-v2.0 is an enriched instruction dataset derived from OpenHermes-2.5, aimed at enhancing its diversity and scope. The dataset amalgamates contributions from various data sources, with a strong emphasis on Biology, Physics, Medicine, Math, Computer Science, Instruction Following, Function Calling, and Roleplay. The data sources used to construct Hercules-v2.0 include:\n- cognitivecomputations/dolphin (first 200k examples)\n- Evol Instruct 70K && 140K\n- teknium/GPT4-LLM-Cleaned\n- jondurbin/airoboros-3.2\n- AlekseyKorshuk/camel-chatml\n- CollectiveCognition/chats-data-2023-09-22\n- Nebulous/lmsys-chat-1m-smortmodelsonly\n- glaiveai/glaive-code-assistant-v2\n- glaiveai/glaive-code-assistant\n- glaiveai/glaive-function-calling-v2\n- garage-bAInd/Open-Platypus\n- meta-math/MetaMathQA (first 40k examples)\n- teknium/GPTeacher-General-Instruct\n- GPTeacher roleplay datasets\n- BI55/MedText\n- pubmed_qa labeled subset\n- Unnatural Instructions\n- CollectiveCognition/chats-data-2023-09-27\n- CollectiveCognition/chats-data-2023-10-16\n\nThis dataset is written with mostly GPT-4, but other models such as Claude-1, Claude-1-instant, Claude-2, Claude-2.1, and GPT-3.5-Turbo can be found in the data.\n\nCuration of this dataset was based on findings from hercules-v1.0.\n\nWarning: This dataset contains toxic examples. Use at your own risk.",
"#### Description\nHercules-v2.0 is designed to serve as a comprehensive and multifaceted dataset tailored for the development and evaluation of advanced machine learning models, particularly those focused on natural language understanding and processing in specialized domains. It includes a variety of formats, such as question-answering pairs, dialogues, function calls, and roleplay scenarios, providing robust training material for models to handle complex instructions and execute function calls.",
"#### Data Format\nThe dataset includes JSON-formatted entries, with a unique structure to incorporate function calling examples. Each entry is composed of a sequence of interactions, each tagged with \"from\" to indicate the speaker (human, function-call, function-response, or gpt) and \"value\" to present the content or payload of the interaction. For example:",
"#### Usage\n\nThe Hercules-v2.0 dataset is designed for training and evaluating AI systems in their ability to follow instructions, execute function calls, and interact in roleplay scenarios across various scientific and technical disciplines. Researchers and developers can leverage this dataset for:\n\n- Enhancing language models' understanding of complex topics.\n- Improving the accuracy of function-call executions within conversational agents.\n- Developing models capable of engaging in educational and informative dialogue.\n- Benchmarking systems on their ability to follow intricate instructions and provide accurate responses.",
"#### Licensing\n\nThis dataset is released under the apache-2.0 license.\nResearchers using Hercules-v2.0 in their work should cite the dataset as follows:",
"#### Acknowledgements\n\nHercules-v2.0 was made possible thanks to the contributions from various datasets and the community's efforts in compiling and refining data to create a rich and diverse instruction set. Special thanks go to the creator of OpenHermes-2.5 and all the data sources listed above.",
"#### Version History\n\n v2.0: Current version with enhanced diversity and scope.\n v1.0: Initial release."
] |
2ef4279e699760ccb5d4d6dff7fd0648b142b306 |
# Dataset Card for Evaluation run of NLPinas/yi-bagel-2x34b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [NLPinas/yi-bagel-2x34b](https://huggingface.co/NLPinas/yi-bagel-2x34b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NLPinas__yi-bagel-2x34b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-03T01:32:07.521685](https://huggingface.co/datasets/open-llm-leaderboard/details_NLPinas__yi-bagel-2x34b/blob/main/results_2024-02-03T01-32-07.521685.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7615183276503468,
"acc_stderr": 0.02832471118128543,
"acc_norm": 0.7668016554766149,
"acc_norm_stderr": 0.028849292688075817,
"mc1": 0.5642594859241126,
"mc1_stderr": 0.01735834539886313,
"mc2": 0.7142422056307771,
"mc2_stderr": 0.014238871538897193
},
"harness|arc:challenge|25": {
"acc": 0.6962457337883959,
"acc_stderr": 0.013438909184778762,
"acc_norm": 0.726962457337884,
"acc_norm_stderr": 0.013019332762635748
},
"harness|hellaswag|10": {
"acc": 0.6620195180242979,
"acc_stderr": 0.0047205513235471265,
"acc_norm": 0.8544114718183629,
"acc_norm_stderr": 0.0035197241633108875
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7481481481481481,
"acc_stderr": 0.03749850709174021,
"acc_norm": 0.7481481481481481,
"acc_norm_stderr": 0.03749850709174021
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.868421052631579,
"acc_stderr": 0.027508689533549912,
"acc_norm": 0.868421052631579,
"acc_norm_stderr": 0.027508689533549912
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8075471698113208,
"acc_stderr": 0.024262979839372274,
"acc_norm": 0.8075471698113208,
"acc_norm_stderr": 0.024262979839372274
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8958333333333334,
"acc_stderr": 0.025545239210256917,
"acc_norm": 0.8958333333333334,
"acc_norm_stderr": 0.025545239210256917
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5784313725490197,
"acc_stderr": 0.04913595201274503,
"acc_norm": 0.5784313725490197,
"acc_norm_stderr": 0.04913595201274503
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7702127659574468,
"acc_stderr": 0.027501752944412417,
"acc_norm": 0.7702127659574468,
"acc_norm_stderr": 0.027501752944412417
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5877192982456141,
"acc_stderr": 0.04630653203366596,
"acc_norm": 0.5877192982456141,
"acc_norm_stderr": 0.04630653203366596
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7241379310344828,
"acc_stderr": 0.037245636197746304,
"acc_norm": 0.7241379310344828,
"acc_norm_stderr": 0.037245636197746304
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.716931216931217,
"acc_stderr": 0.023201392938194974,
"acc_norm": 0.716931216931217,
"acc_norm_stderr": 0.023201392938194974
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9032258064516129,
"acc_stderr": 0.016818943416345197,
"acc_norm": 0.9032258064516129,
"acc_norm_stderr": 0.016818943416345197
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6403940886699507,
"acc_stderr": 0.03376458246509567,
"acc_norm": 0.6403940886699507,
"acc_norm_stderr": 0.03376458246509567
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8606060606060606,
"acc_stderr": 0.027045948825865394,
"acc_norm": 0.8606060606060606,
"acc_norm_stderr": 0.027045948825865394
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9191919191919192,
"acc_stderr": 0.019417681889724536,
"acc_norm": 0.9191919191919192,
"acc_norm_stderr": 0.019417681889724536
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9689119170984456,
"acc_stderr": 0.012525310625527033,
"acc_norm": 0.9689119170984456,
"acc_norm_stderr": 0.012525310625527033
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8179487179487179,
"acc_stderr": 0.0195652367829309,
"acc_norm": 0.8179487179487179,
"acc_norm_stderr": 0.0195652367829309
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4703703703703704,
"acc_stderr": 0.030431963547936584,
"acc_norm": 0.4703703703703704,
"acc_norm_stderr": 0.030431963547936584
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.02476290267805791,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.02476290267805791
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4966887417218543,
"acc_stderr": 0.04082393379449654,
"acc_norm": 0.4966887417218543,
"acc_norm_stderr": 0.04082393379449654
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9137614678899083,
"acc_stderr": 0.012035597300116245,
"acc_norm": 0.9137614678899083,
"acc_norm_stderr": 0.012035597300116245
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.0321495214780275,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.0321495214780275
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9166666666666666,
"acc_stderr": 0.019398452135813905,
"acc_norm": 0.9166666666666666,
"acc_norm_stderr": 0.019398452135813905
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9071729957805907,
"acc_stderr": 0.01888975055095671,
"acc_norm": 0.9071729957805907,
"acc_norm_stderr": 0.01888975055095671
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8026905829596412,
"acc_stderr": 0.02670985334496796,
"acc_norm": 0.8026905829596412,
"acc_norm_stderr": 0.02670985334496796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8702290076335878,
"acc_stderr": 0.029473649496907065,
"acc_norm": 0.8702290076335878,
"acc_norm_stderr": 0.029473649496907065
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8925619834710744,
"acc_stderr": 0.028268812192540637,
"acc_norm": 0.8925619834710744,
"acc_norm_stderr": 0.028268812192540637
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8981481481481481,
"acc_stderr": 0.02923927267563275,
"acc_norm": 0.8981481481481481,
"acc_norm_stderr": 0.02923927267563275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8650306748466258,
"acc_stderr": 0.026845765054553838,
"acc_norm": 0.8650306748466258,
"acc_norm_stderr": 0.026845765054553838
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8737864077669902,
"acc_stderr": 0.03288180278808628,
"acc_norm": 0.8737864077669902,
"acc_norm_stderr": 0.03288180278808628
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9444444444444444,
"acc_stderr": 0.015006312806446912,
"acc_norm": 0.9444444444444444,
"acc_norm_stderr": 0.015006312806446912
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.9,
"acc_stderr": 0.03015113445777634,
"acc_norm": 0.9,
"acc_norm_stderr": 0.03015113445777634
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9016602809706258,
"acc_stderr": 0.010648356301876338,
"acc_norm": 0.9016602809706258,
"acc_norm_stderr": 0.010648356301876338
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.815028901734104,
"acc_stderr": 0.02090397584208303,
"acc_norm": 0.815028901734104,
"acc_norm_stderr": 0.02090397584208303
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7988826815642458,
"acc_stderr": 0.013405946402609049,
"acc_norm": 0.7988826815642458,
"acc_norm_stderr": 0.013405946402609049
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.020279402936174588,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.020279402936174588
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8135048231511254,
"acc_stderr": 0.022122439772480768,
"acc_norm": 0.8135048231511254,
"acc_norm_stderr": 0.022122439772480768
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8672839506172839,
"acc_stderr": 0.018877353839571842,
"acc_norm": 0.8672839506172839,
"acc_norm_stderr": 0.018877353839571842
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.624113475177305,
"acc_stderr": 0.028893955412115875,
"acc_norm": 0.624113475177305,
"acc_norm_stderr": 0.028893955412115875
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5808344198174706,
"acc_stderr": 0.012602244505788224,
"acc_norm": 0.5808344198174706,
"acc_norm_stderr": 0.012602244505788224
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8308823529411765,
"acc_stderr": 0.022770868010113018,
"acc_norm": 0.8308823529411765,
"acc_norm_stderr": 0.022770868010113018
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8120915032679739,
"acc_stderr": 0.015803565736776694,
"acc_norm": 0.8120915032679739,
"acc_norm_stderr": 0.015803565736776694
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7363636363636363,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.7363636363636363,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8285714285714286,
"acc_stderr": 0.02412746346265015,
"acc_norm": 0.8285714285714286,
"acc_norm_stderr": 0.02412746346265015
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.900497512437811,
"acc_stderr": 0.021166216304659393,
"acc_norm": 0.900497512437811,
"acc_norm_stderr": 0.021166216304659393
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.02410338420207286,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.02410338420207286
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5642594859241126,
"mc1_stderr": 0.01735834539886313,
"mc2": 0.7142422056307771,
"mc2_stderr": 0.014238871538897193
},
"harness|winogrande|5": {
"acc": 0.8271507498026835,
"acc_stderr": 0.010626964529971868
},
"harness|gsm8k|5": {
"acc": 0.6072782410917361,
"acc_stderr": 0.013451745349586576
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_NLPinas__yi-bagel-2x34b | [
"region:us"
] | 2024-02-03T01:34:21+00:00 | {"pretty_name": "Evaluation run of NLPinas/yi-bagel-2x34b", "dataset_summary": "Dataset automatically created during the evaluation run of model [NLPinas/yi-bagel-2x34b](https://huggingface.co/NLPinas/yi-bagel-2x34b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NLPinas__yi-bagel-2x34b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-03T01:32:07.521685](https://huggingface.co/datasets/open-llm-leaderboard/details_NLPinas__yi-bagel-2x34b/blob/main/results_2024-02-03T01-32-07.521685.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7615183276503468,\n \"acc_stderr\": 0.02832471118128543,\n \"acc_norm\": 0.7668016554766149,\n \"acc_norm_stderr\": 0.028849292688075817,\n \"mc1\": 0.5642594859241126,\n \"mc1_stderr\": 0.01735834539886313,\n \"mc2\": 0.7142422056307771,\n \"mc2_stderr\": 0.014238871538897193\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6962457337883959,\n \"acc_stderr\": 0.013438909184778762,\n \"acc_norm\": 0.726962457337884,\n \"acc_norm_stderr\": 0.013019332762635748\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6620195180242979,\n \"acc_stderr\": 0.0047205513235471265,\n \"acc_norm\": 0.8544114718183629,\n \"acc_norm_stderr\": 0.0035197241633108875\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7481481481481481,\n \"acc_stderr\": 0.03749850709174021,\n \"acc_norm\": 0.7481481481481481,\n \"acc_norm_stderr\": 0.03749850709174021\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.868421052631579,\n \"acc_stderr\": 0.027508689533549912,\n \"acc_norm\": 0.868421052631579,\n \"acc_norm_stderr\": 0.027508689533549912\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8075471698113208,\n \"acc_stderr\": 0.024262979839372274,\n \"acc_norm\": 0.8075471698113208,\n \"acc_norm_stderr\": 0.024262979839372274\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8958333333333334,\n \"acc_stderr\": 0.025545239210256917,\n \"acc_norm\": 0.8958333333333334,\n \"acc_norm_stderr\": 0.025545239210256917\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n 
\"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5784313725490197,\n \"acc_stderr\": 0.04913595201274503,\n \"acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.04913595201274503\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7702127659574468,\n \"acc_stderr\": 0.027501752944412417,\n \"acc_norm\": 0.7702127659574468,\n \"acc_norm_stderr\": 0.027501752944412417\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5877192982456141,\n \"acc_stderr\": 0.04630653203366596,\n \"acc_norm\": 0.5877192982456141,\n \"acc_norm_stderr\": 0.04630653203366596\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7241379310344828,\n \"acc_stderr\": 0.037245636197746304,\n \"acc_norm\": 0.7241379310344828,\n \"acc_norm_stderr\": 0.037245636197746304\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.716931216931217,\n \"acc_stderr\": 0.023201392938194974,\n \"acc_norm\": 0.716931216931217,\n \"acc_norm_stderr\": 0.023201392938194974\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9032258064516129,\n \"acc_stderr\": 0.016818943416345197,\n \"acc_norm\": 0.9032258064516129,\n \"acc_norm_stderr\": 0.016818943416345197\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6403940886699507,\n \"acc_stderr\": 0.03376458246509567,\n \"acc_norm\": 0.6403940886699507,\n \"acc_norm_stderr\": 0.03376458246509567\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8606060606060606,\n \"acc_stderr\": 0.027045948825865394,\n \"acc_norm\": 0.8606060606060606,\n \"acc_norm_stderr\": 0.027045948825865394\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9191919191919192,\n \"acc_stderr\": 0.019417681889724536,\n \"acc_norm\": 0.9191919191919192,\n \"acc_norm_stderr\": 0.019417681889724536\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9689119170984456,\n \"acc_stderr\": 0.012525310625527033,\n \"acc_norm\": 0.9689119170984456,\n \"acc_norm_stderr\": 0.012525310625527033\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8179487179487179,\n \"acc_stderr\": 0.0195652367829309,\n \"acc_norm\": 0.8179487179487179,\n \"acc_norm_stderr\": 0.0195652367829309\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4703703703703704,\n \"acc_stderr\": 0.030431963547936584,\n \"acc_norm\": 0.4703703703703704,\n \"acc_norm_stderr\": 0.030431963547936584\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.02476290267805791,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.02476290267805791\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4966887417218543,\n \"acc_stderr\": 0.04082393379449654,\n \"acc_norm\": 0.4966887417218543,\n \"acc_norm_stderr\": 0.04082393379449654\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9137614678899083,\n \"acc_stderr\": 0.012035597300116245,\n \"acc_norm\": 0.9137614678899083,\n \"acc_norm_stderr\": 0.012035597300116245\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0321495214780275,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0321495214780275\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9166666666666666,\n \"acc_stderr\": 0.019398452135813905,\n \"acc_norm\": 0.9166666666666666,\n \"acc_norm_stderr\": 0.019398452135813905\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9071729957805907,\n \"acc_stderr\": 0.01888975055095671,\n \"acc_norm\": 0.9071729957805907,\n \"acc_norm_stderr\": 0.01888975055095671\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8026905829596412,\n \"acc_stderr\": 0.02670985334496796,\n \"acc_norm\": 0.8026905829596412,\n \"acc_norm_stderr\": 0.02670985334496796\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8925619834710744,\n \"acc_stderr\": 0.028268812192540637,\n \"acc_norm\": 0.8925619834710744,\n \"acc_norm_stderr\": 0.028268812192540637\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8981481481481481,\n \"acc_stderr\": 0.02923927267563275,\n \"acc_norm\": 0.8981481481481481,\n \"acc_norm_stderr\": 0.02923927267563275\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8650306748466258,\n \"acc_stderr\": 0.026845765054553838,\n \"acc_norm\": 0.8650306748466258,\n \"acc_norm_stderr\": 0.026845765054553838\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8737864077669902,\n \"acc_stderr\": 0.03288180278808628,\n \"acc_norm\": 0.8737864077669902,\n \"acc_norm_stderr\": 0.03288180278808628\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9444444444444444,\n \"acc_stderr\": 0.015006312806446912,\n \"acc_norm\": 0.9444444444444444,\n \"acc_norm_stderr\": 0.015006312806446912\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.03015113445777634,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.03015113445777634\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.9016602809706258,\n \"acc_stderr\": 0.010648356301876338,\n \"acc_norm\": 0.9016602809706258,\n \"acc_norm_stderr\": 0.010648356301876338\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.815028901734104,\n \"acc_stderr\": 0.02090397584208303,\n \"acc_norm\": 0.815028901734104,\n \"acc_norm_stderr\": 0.02090397584208303\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7988826815642458,\n \"acc_stderr\": 0.013405946402609049,\n \"acc_norm\": 0.7988826815642458,\n \"acc_norm_stderr\": 0.013405946402609049\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.020279402936174588,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.020279402936174588\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8135048231511254,\n \"acc_stderr\": 0.022122439772480768,\n \"acc_norm\": 0.8135048231511254,\n \"acc_norm_stderr\": 0.022122439772480768\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8672839506172839,\n \"acc_stderr\": 0.018877353839571842,\n \"acc_norm\": 0.8672839506172839,\n \"acc_norm_stderr\": 0.018877353839571842\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.624113475177305,\n \"acc_stderr\": 0.028893955412115875,\n \"acc_norm\": 0.624113475177305,\n \"acc_norm_stderr\": 0.028893955412115875\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5808344198174706,\n \"acc_stderr\": 0.012602244505788224,\n \"acc_norm\": 0.5808344198174706,\n \"acc_norm_stderr\": 0.012602244505788224\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8308823529411765,\n \"acc_stderr\": 0.022770868010113018,\n \"acc_norm\": 0.8308823529411765,\n \"acc_norm_stderr\": 0.022770868010113018\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8120915032679739,\n \"acc_stderr\": 0.015803565736776694,\n \"acc_norm\": 0.8120915032679739,\n \"acc_norm_stderr\": 0.015803565736776694\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8285714285714286,\n \"acc_stderr\": 0.02412746346265015,\n \"acc_norm\": 0.8285714285714286,\n \"acc_norm_stderr\": 0.02412746346265015\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n \"acc_stderr\": 0.021166216304659393,\n \"acc_norm\": 0.900497512437811,\n \"acc_norm_stderr\": 0.021166216304659393\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.02410338420207286,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.02410338420207286\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5642594859241126,\n \"mc1_stderr\": 0.01735834539886313,\n \"mc2\": 0.7142422056307771,\n \"mc2_stderr\": 0.014238871538897193\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8271507498026835,\n \"acc_stderr\": 0.010626964529971868\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6072782410917361,\n \"acc_stderr\": 0.013451745349586576\n 
}\n}\n```", "repo_url": "https://huggingface.co/NLPinas/yi-bagel-2x34b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|arc:challenge|25_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|gsm8k|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hellaswag|10_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T01-32-07.521685.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T01-32-07.521685.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T01-32-07.521685.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T01-32-07.521685.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T01-32-07.521685.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_03T01_32_07.521685", "path": ["**/details_harness|winogrande|5_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-03T01-32-07.521685.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_03T01_32_07.521685", "path": ["results_2024-02-03T01-32-07.521685.parquet"]}, {"split": "latest", "path": ["results_2024-02-03T01-32-07.521685.parquet"]}]}]} | 2024-02-03T01:34:45+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of NLPinas/yi-bagel-2x34b
Dataset automatically created during the evaluation run of model NLPinas/yi-bagel-2x34b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-02-03T01:32:07.521685 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of NLPinas/yi-bagel-2x34b\n\n\n\nDataset automatically created during the evaluation run of model NLPinas/yi-bagel-2x34b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T01:32:07.521685(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of NLPinas/yi-bagel-2x34b\n\n\n\nDataset automatically created during the evaluation run of model NLPinas/yi-bagel-2x34b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T01:32:07.521685(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
96b30def1fcc546320be613e42f20bee2af3c26e |
# Dataset Card for Evaluation run of ConvexAI/Solutus-3x7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ConvexAI/Solutus-3x7B](https://huggingface.co/ConvexAI/Solutus-3x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ConvexAI__Solutus-3x7B",
"harness_winogrande_5",
split="train")
```
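The same call works for any of the 63 task configurations. The aggregated numbers live in the separate "results" configuration; below is a minimal sketch for pulling them (the `"results"` config name and `"latest"` split name follow the split naming visible in this repo's metadata; adjust if the layout changes):

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run of this model
results = load_dataset(
    "open-llm-leaderboard/details_ConvexAI__Solutus-3x7B",
    "results",
    split="latest",
)
print(results[0])  # a single row holding the aggregated scores shown below
```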
## Latest results
These are the [latest results from run 2024-02-03T01:55:40.169312](https://huggingface.co/datasets/open-llm-leaderboard/details_ConvexAI__Solutus-3x7B/blob/main/results_2024-02-03T01-55-40.169312.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6535123315208778,
"acc_stderr": 0.03206837848994339,
"acc_norm": 0.6529196146027658,
"acc_norm_stderr": 0.0327383458208581,
"mc1": 0.5361077111383109,
"mc1_stderr": 0.017457800422268625,
"mc2": 0.6752264598345707,
"mc2_stderr": 0.015215545170563017
},
"harness|arc:challenge|25": {
"acc": 0.7022184300341296,
"acc_stderr": 0.013363080107244485,
"acc_norm": 0.7201365187713311,
"acc_norm_stderr": 0.01311904089772592
},
"harness|hellaswag|10": {
"acc": 0.7081258713403704,
"acc_stderr": 0.0045369557965105455,
"acc_norm": 0.8830910177255527,
"acc_norm_stderr": 0.0032065512832573973
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337124,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337124
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.049135952012744975,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.049135952012744975
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.02550648169813821,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.02550648169813821
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.02315787934908352,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.02315787934908352
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.027772533334218974,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.027772533334218974
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603346,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603346
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.02882088466625326,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.02882088466625326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.03038835355188679,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.03038835355188679
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092444,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092444
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.0251956584289318,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.0251956584289318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229136,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229136
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990946,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990946
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834834,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834834
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.02344582627654554,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.02344582627654554
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43798882681564244,
"acc_stderr": 0.01659339422756484,
"acc_norm": 0.43798882681564244,
"acc_norm_stderr": 0.01659339422756484
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46479791395045633,
"acc_stderr": 0.012738547371303957,
"acc_norm": 0.46479791395045633,
"acc_norm_stderr": 0.012738547371303957
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.02858270975389845,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.02858270975389845
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.018950886770806315,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.018950886770806315
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.02879518557429129,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.02879518557429129
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263686,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263686
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5361077111383109,
"mc1_stderr": 0.017457800422268625,
"mc2": 0.6752264598345707,
"mc2_stderr": 0.015215545170563017
},
"harness|winogrande|5": {
"acc": 0.8366219415943172,
"acc_stderr": 0.010390695970273766
},
"harness|gsm8k|5": {
"acc": 0.6982562547384382,
"acc_stderr": 0.012643544762873358
}
}
```
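As a worked example of reading these metrics, here is a minimal sketch that averages the per-subject MMLU accuracies; it assumes the dict printed above has been saved locally as `results.json` (a hypothetical filename — any copy of the JSON blob will do):

```python
import json

# Hypothetical local copy of the results dict shown above
with open("results.json") as f:
    scores = json.load(f)

# Keep only the MMLU (hendrycksTest) subjects and take an unweighted mean
mmlu_accs = [
    v["acc"] for k, v in scores.items() if k.startswith("harness|hendrycksTest")
]
print(f"{len(mmlu_accs)} MMLU subjects, mean acc = {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```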
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_ConvexAI__Solutus-3x7B | [
"region:us"
] | 2024-02-03T01:57:57+00:00 | {"pretty_name": "Evaluation run of ConvexAI/Solutus-3x7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [ConvexAI/Solutus-3x7B](https://huggingface.co/ConvexAI/Solutus-3x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ConvexAI__Solutus-3x7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-03T01:55:40.169312](https://huggingface.co/datasets/open-llm-leaderboard/details_ConvexAI__Solutus-3x7B/blob/main/results_2024-02-03T01-55-40.169312.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6535123315208778,\n \"acc_stderr\": 0.03206837848994339,\n \"acc_norm\": 0.6529196146027658,\n \"acc_norm_stderr\": 0.0327383458208581,\n \"mc1\": 0.5361077111383109,\n \"mc1_stderr\": 0.017457800422268625,\n \"mc2\": 0.6752264598345707,\n \"mc2_stderr\": 0.015215545170563017\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7022184300341296,\n \"acc_stderr\": 0.013363080107244485,\n \"acc_norm\": 0.7201365187713311,\n \"acc_norm_stderr\": 0.01311904089772592\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7081258713403704,\n \"acc_stderr\": 0.0045369557965105455,\n \"acc_norm\": 0.8830910177255527,\n \"acc_norm_stderr\": 0.0032065512832573973\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337124,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337124\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n 
\"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.049135952012744975,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.049135952012744975\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4312169312169312,\n \"acc_stderr\": 0.02550648169813821,\n \"acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.02550648169813821\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.02315787934908352,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.02315787934908352\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218974,\n \"acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218974\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603346,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603346\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.03038835355188679,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.03038835355188679\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092444,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092444\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.0251956584289318,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.0251956584289318\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.030769352008229136,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.030769352008229136\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n 
\"acc_stderr\": 0.013664230995834834,\n \"acc_norm\": 0.822477650063857,\n \"acc_norm_stderr\": 0.013664230995834834\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.02344582627654554,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.02344582627654554\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43798882681564244,\n \"acc_stderr\": 0.01659339422756484,\n \"acc_norm\": 0.43798882681564244,\n \"acc_norm_stderr\": 0.01659339422756484\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46479791395045633,\n \"acc_stderr\": 0.012738547371303957,\n \"acc_norm\": 0.46479791395045633,\n \"acc_norm_stderr\": 0.012738547371303957\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389845,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389845\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806315,\n \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806315\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.02879518557429129,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.02879518557429129\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263686,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263686\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5361077111383109,\n \"mc1_stderr\": 0.017457800422268625,\n \"mc2\": 0.6752264598345707,\n \"mc2_stderr\": 0.015215545170563017\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8366219415943172,\n \"acc_stderr\": 0.010390695970273766\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6982562547384382,\n \"acc_stderr\": 0.012643544762873358\n }\n}\n```", "repo_url": 
"https://huggingface.co/ConvexAI/Solutus-3x7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|arc:challenge|25_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|gsm8k|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hellaswag|10_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T01-55-40.169312.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T01-55-40.169312.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T01-55-40.169312.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T01-55-40.169312.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T01-55-40.169312.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_03T01_55_40.169312", "path": ["**/details_harness|winogrande|5_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-03T01-55-40.169312.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_03T01_55_40.169312", "path": ["results_2024-02-03T01-55-40.169312.parquet"]}, {"split": "latest", "path": ["results_2024-02-03T01-55-40.169312.parquet"]}]}]} | 2024-02-03T01:58:22+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of ConvexAI/Solutus-3x7B
Dataset automatically created during the evaluation run of model ConvexAI/Solutus-3x7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
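A minimal sketch, assuming the details repo follows the leaderboard's usual `details_<org>__<model>` naming (the exact dataset id `open-llm-leaderboard/details_ConvexAI__Solutus-3x7B` is an assumption here; `harness_winogrande_5` is one of the configs listed in this run's metadata):
```python
from datasets import load_dataset

# Assumed repo id, derived from the leaderboard's details_<org>__<model> pattern
data = load_dataset("open-llm-leaderboard/details_ConvexAI__Solutus-3x7B",
	"harness_winogrande_5",
	split="train")
```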
## Latest results
These are the latest results from run 2024-02-03T01:55:40.169312 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of ConvexAI/Solutus-3x7B\n\n\n\nDataset automatically created during the evaluation run of model ConvexAI/Solutus-3x7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T01:55:40.169312(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of ConvexAI/Solutus-3x7B\n\n\n\nDataset automatically created during the evaluation run of model ConvexAI/Solutus-3x7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T01:55:40.169312(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
d5b416397dae2fe4304c0f619b9f3bab7cc97fdf |
# Dataset Card for Evaluation run of vikash06/doctorLLM
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [vikash06/doctorLLM](https://huggingface.co/vikash06/doctorLLM) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_vikash06__doctorLLM",
"harness_winogrande_5",
split="train")
```
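To pull only the aggregated metrics, the run also exposes a `results` configuration; a minimal sketch, assuming the same config/split layout as the per-task files (each run timestamp is a split, with `latest` pointing at the most recent one):
```python
from datasets import load_dataset

# "results" holds the aggregated metrics; "latest" is the most recent run
results = load_dataset("open-llm-leaderboard/details_vikash06__doctorLLM",
	"results",
	split="latest")
print(results[0])
```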
## Latest results
These are the [latest results from run 2024-02-03T02:21:20.637179](https://huggingface.co/datasets/open-llm-leaderboard/details_vikash06__doctorLLM/blob/main/results_2024-02-03T02-21-20.637179.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.46665458898018775,
"acc_stderr": 0.03452755757598034,
"acc_norm": 0.4713989146395946,
"acc_norm_stderr": 0.03530907900143319,
"mc1": 0.2778457772337821,
"mc1_stderr": 0.015680929364024654,
"mc2": 0.42520750514050065,
"mc2_stderr": 0.01525852140834329
},
"harness|arc:challenge|25": {
"acc": 0.5102389078498294,
"acc_stderr": 0.014608326906285008,
"acc_norm": 0.5290102389078498,
"acc_norm_stderr": 0.014586776355294326
},
"harness|hellaswag|10": {
"acc": 0.6190001991635132,
"acc_stderr": 0.004846400325585245,
"acc_norm": 0.7976498705437164,
"acc_norm_stderr": 0.004009307895677148
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.40131578947368424,
"acc_stderr": 0.03988903703336284,
"acc_norm": 0.40131578947368424,
"acc_norm_stderr": 0.03988903703336284
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.47924528301886793,
"acc_stderr": 0.03074634997572347,
"acc_norm": 0.47924528301886793,
"acc_norm_stderr": 0.03074634997572347
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.041795966175810016,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.041795966175810016
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.43352601156069365,
"acc_stderr": 0.03778621079092055,
"acc_norm": 0.43352601156069365,
"acc_norm_stderr": 0.03778621079092055
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.042207736591714534,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.042207736591714534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.4,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.04372748290278007,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.04372748290278007
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.45517241379310347,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.45517241379310347,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02256989707491841,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02256989707491841
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5129032258064516,
"acc_stderr": 0.02843453315268187,
"acc_norm": 0.5129032258064516,
"acc_norm_stderr": 0.02843453315268187
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.32019704433497537,
"acc_stderr": 0.032826493853041504,
"acc_norm": 0.32019704433497537,
"acc_norm_stderr": 0.032826493853041504
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6,
"acc_stderr": 0.038254602783800246,
"acc_norm": 0.6,
"acc_norm_stderr": 0.038254602783800246
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.03547601494006936,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.03547601494006936
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6632124352331606,
"acc_stderr": 0.03410780251836184,
"acc_norm": 0.6632124352331606,
"acc_norm_stderr": 0.03410780251836184
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.441025641025641,
"acc_stderr": 0.02517404838400076,
"acc_norm": 0.441025641025641,
"acc_norm_stderr": 0.02517404838400076
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.02794045713622841,
"acc_norm": 0.3,
"acc_norm_stderr": 0.02794045713622841
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.03214536859788639,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.03214536859788639
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.618348623853211,
"acc_stderr": 0.020828148517022582,
"acc_norm": 0.618348623853211,
"acc_norm_stderr": 0.020828148517022582
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2824074074074074,
"acc_stderr": 0.030701372111510937,
"acc_norm": 0.2824074074074074,
"acc_norm_stderr": 0.030701372111510937
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5245098039215687,
"acc_stderr": 0.03505093194348798,
"acc_norm": 0.5245098039215687,
"acc_norm_stderr": 0.03505093194348798
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6160337552742616,
"acc_stderr": 0.031658678064106674,
"acc_norm": 0.6160337552742616,
"acc_norm_stderr": 0.031658678064106674
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5381165919282511,
"acc_stderr": 0.033460150119732274,
"acc_norm": 0.5381165919282511,
"acc_norm_stderr": 0.033460150119732274
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5877862595419847,
"acc_stderr": 0.04317171194870255,
"acc_norm": 0.5877862595419847,
"acc_norm_stderr": 0.04317171194870255
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6033057851239669,
"acc_stderr": 0.044658697805310094,
"acc_norm": 0.6033057851239669,
"acc_norm_stderr": 0.044658697805310094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.04830366024635331,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.04830366024635331
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5337423312883436,
"acc_stderr": 0.039194155450484096,
"acc_norm": 0.5337423312883436,
"acc_norm_stderr": 0.039194155450484096
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.5533980582524272,
"acc_stderr": 0.04922424153458933,
"acc_norm": 0.5533980582524272,
"acc_norm_stderr": 0.04922424153458933
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7008547008547008,
"acc_stderr": 0.029996951858349483,
"acc_norm": 0.7008547008547008,
"acc_norm_stderr": 0.029996951858349483
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6270753512132823,
"acc_stderr": 0.017292868269453938,
"acc_norm": 0.6270753512132823,
"acc_norm_stderr": 0.017292868269453938
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.49710982658959535,
"acc_stderr": 0.026918645383239022,
"acc_norm": 0.49710982658959535,
"acc_norm_stderr": 0.026918645383239022
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24134078212290502,
"acc_stderr": 0.014310999547961443,
"acc_norm": 0.24134078212290502,
"acc_norm_stderr": 0.014310999547961443
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.46405228758169936,
"acc_stderr": 0.028555827516528787,
"acc_norm": 0.46405228758169936,
"acc_norm_stderr": 0.028555827516528787
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5819935691318328,
"acc_stderr": 0.028013651891995072,
"acc_norm": 0.5819935691318328,
"acc_norm_stderr": 0.028013651891995072
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5030864197530864,
"acc_stderr": 0.02782021415859437,
"acc_norm": 0.5030864197530864,
"acc_norm_stderr": 0.02782021415859437
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36524822695035464,
"acc_stderr": 0.028723863853281278,
"acc_norm": 0.36524822695035464,
"acc_norm_stderr": 0.028723863853281278
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.36962190352020863,
"acc_stderr": 0.01232844577857525,
"acc_norm": 0.36962190352020863,
"acc_norm_stderr": 0.01232844577857525
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.49264705882352944,
"acc_stderr": 0.030369552523902173,
"acc_norm": 0.49264705882352944,
"acc_norm_stderr": 0.030369552523902173
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4297385620915033,
"acc_stderr": 0.020027122784928547,
"acc_norm": 0.4297385620915033,
"acc_norm_stderr": 0.020027122784928547
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.509090909090909,
"acc_stderr": 0.0478833976870286,
"acc_norm": 0.509090909090909,
"acc_norm_stderr": 0.0478833976870286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4530612244897959,
"acc_stderr": 0.03186785930004128,
"acc_norm": 0.4530612244897959,
"acc_norm_stderr": 0.03186785930004128
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6865671641791045,
"acc_stderr": 0.03280188205348643,
"acc_norm": 0.6865671641791045,
"acc_norm_stderr": 0.03280188205348643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7076023391812866,
"acc_stderr": 0.03488647713457923,
"acc_norm": 0.7076023391812866,
"acc_norm_stderr": 0.03488647713457923
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2778457772337821,
"mc1_stderr": 0.015680929364024654,
"mc2": 0.42520750514050065,
"mc2_stderr": 0.01525852140834329
},
"harness|winogrande|5": {
"acc": 0.7158642462509865,
"acc_stderr": 0.01267539278677273
},
"harness|gsm8k|5": {
"acc": 0.13495072024260804,
"acc_stderr": 0.00941131528257117
}
}
```
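For a quick side-by-side of `acc` versus `acc_norm` across tasks, the per-task dictionary above can be flattened into a table; a rough sketch over a hand-copied excerpt of the numbers shown (the use of pandas is an assumption, any tabular tool works):
```python
import pandas as pd

# Hand-copied excerpt of the per-task dict printed above
metrics = {
    "harness|arc:challenge|25": {"acc": 0.5102389078498294, "acc_norm": 0.5290102389078498},
    "harness|hellaswag|10": {"acc": 0.6190001991635132, "acc_norm": 0.7976498705437164},
    "harness|winogrande|5": {"acc": 0.7158642462509865},
}

df = pd.DataFrame(metrics).T  # tasks as rows, metrics as columns
print(df.sort_values("acc", ascending=False))
```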
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_vikash06__doctorLLM | [
"region:us"
] | 2024-02-03T02:23:43+00:00 | {"pretty_name": "Evaluation run of vikash06/doctorLLM", "dataset_summary": "Dataset automatically created during the evaluation run of model [vikash06/doctorLLM](https://huggingface.co/vikash06/doctorLLM) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vikash06__doctorLLM\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-03T02:21:20.637179](https://huggingface.co/datasets/open-llm-leaderboard/details_vikash06__doctorLLM/blob/main/results_2024-02-03T02-21-20.637179.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.46665458898018775,\n \"acc_stderr\": 0.03452755757598034,\n \"acc_norm\": 0.4713989146395946,\n \"acc_norm_stderr\": 0.03530907900143319,\n \"mc1\": 0.2778457772337821,\n \"mc1_stderr\": 0.015680929364024654,\n \"mc2\": 0.42520750514050065,\n \"mc2_stderr\": 0.01525852140834329\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5102389078498294,\n \"acc_stderr\": 0.014608326906285008,\n \"acc_norm\": 0.5290102389078498,\n \"acc_norm_stderr\": 0.014586776355294326\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6190001991635132,\n \"acc_stderr\": 0.004846400325585245,\n \"acc_norm\": 0.7976498705437164,\n \"acc_norm_stderr\": 0.004009307895677148\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.40131578947368424,\n \"acc_stderr\": 0.03988903703336284,\n \"acc_norm\": 0.40131578947368424,\n \"acc_norm_stderr\": 0.03988903703336284\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.47924528301886793,\n \"acc_stderr\": 0.03074634997572347,\n \"acc_norm\": 0.47924528301886793,\n \"acc_norm_stderr\": 0.03074634997572347\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4861111111111111,\n \"acc_stderr\": 0.041795966175810016,\n \"acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.041795966175810016\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 
0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.43352601156069365,\n \"acc_stderr\": 0.03778621079092055,\n \"acc_norm\": 0.43352601156069365,\n \"acc_norm_stderr\": 0.03778621079092055\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.042207736591714534,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.042207736591714534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.03202563076101735,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.04372748290278007,\n \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.04372748290278007\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.02256989707491841,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02256989707491841\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5129032258064516,\n \"acc_stderr\": 0.02843453315268187,\n \"acc_norm\": 0.5129032258064516,\n \"acc_norm_stderr\": 0.02843453315268187\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.32019704433497537,\n \"acc_stderr\": 0.032826493853041504,\n \"acc_norm\": 0.32019704433497537,\n \"acc_norm_stderr\": 0.032826493853041504\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.038254602783800246,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.038254602783800246\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5454545454545454,\n \"acc_stderr\": 0.03547601494006936,\n \"acc_norm\": 0.5454545454545454,\n \"acc_norm_stderr\": 0.03547601494006936\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6632124352331606,\n \"acc_stderr\": 0.03410780251836184,\n \"acc_norm\": 0.6632124352331606,\n \"acc_norm_stderr\": 0.03410780251836184\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.441025641025641,\n 
\"acc_stderr\": 0.02517404838400076,\n \"acc_norm\": 0.441025641025641,\n \"acc_norm_stderr\": 0.02517404838400076\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.02794045713622841,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.02794045713622841\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.03214536859788639,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.03214536859788639\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.618348623853211,\n \"acc_stderr\": 0.020828148517022582,\n \"acc_norm\": 0.618348623853211,\n \"acc_norm_stderr\": 0.020828148517022582\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2824074074074074,\n \"acc_stderr\": 0.030701372111510937,\n \"acc_norm\": 0.2824074074074074,\n \"acc_norm_stderr\": 0.030701372111510937\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5245098039215687,\n \"acc_stderr\": 0.03505093194348798,\n \"acc_norm\": 0.5245098039215687,\n \"acc_norm_stderr\": 0.03505093194348798\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6160337552742616,\n \"acc_stderr\": 0.031658678064106674,\n \"acc_norm\": 0.6160337552742616,\n \"acc_norm_stderr\": 0.031658678064106674\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5381165919282511,\n \"acc_stderr\": 0.033460150119732274,\n \"acc_norm\": 0.5381165919282511,\n \"acc_norm_stderr\": 0.033460150119732274\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5877862595419847,\n \"acc_stderr\": 0.04317171194870255,\n \"acc_norm\": 0.5877862595419847,\n \"acc_norm_stderr\": 0.04317171194870255\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6033057851239669,\n \"acc_stderr\": 0.044658697805310094,\n \"acc_norm\": 0.6033057851239669,\n \"acc_norm_stderr\": 0.044658697805310094\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.04830366024635331,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.04830366024635331\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5337423312883436,\n \"acc_stderr\": 0.039194155450484096,\n \"acc_norm\": 0.5337423312883436,\n \"acc_norm_stderr\": 0.039194155450484096\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5533980582524272,\n \"acc_stderr\": 0.04922424153458933,\n \"acc_norm\": 0.5533980582524272,\n \"acc_norm_stderr\": 0.04922424153458933\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7008547008547008,\n \"acc_stderr\": 0.029996951858349483,\n \"acc_norm\": 0.7008547008547008,\n \"acc_norm_stderr\": 0.029996951858349483\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6270753512132823,\n \"acc_stderr\": 0.017292868269453938,\n \"acc_norm\": 0.6270753512132823,\n \"acc_norm_stderr\": 
0.017292868269453938\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.49710982658959535,\n \"acc_stderr\": 0.026918645383239022,\n \"acc_norm\": 0.49710982658959535,\n \"acc_norm_stderr\": 0.026918645383239022\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n \"acc_stderr\": 0.014310999547961443,\n \"acc_norm\": 0.24134078212290502,\n \"acc_norm_stderr\": 0.014310999547961443\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.46405228758169936,\n \"acc_stderr\": 0.028555827516528787,\n \"acc_norm\": 0.46405228758169936,\n \"acc_norm_stderr\": 0.028555827516528787\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5819935691318328,\n \"acc_stderr\": 0.028013651891995072,\n \"acc_norm\": 0.5819935691318328,\n \"acc_norm_stderr\": 0.028013651891995072\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5030864197530864,\n \"acc_stderr\": 0.02782021415859437,\n \"acc_norm\": 0.5030864197530864,\n \"acc_norm_stderr\": 0.02782021415859437\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.36524822695035464,\n \"acc_stderr\": 0.028723863853281278,\n \"acc_norm\": 0.36524822695035464,\n \"acc_norm_stderr\": 0.028723863853281278\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.36962190352020863,\n \"acc_stderr\": 0.01232844577857525,\n \"acc_norm\": 0.36962190352020863,\n \"acc_norm_stderr\": 0.01232844577857525\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.49264705882352944,\n \"acc_stderr\": 0.030369552523902173,\n \"acc_norm\": 0.49264705882352944,\n \"acc_norm_stderr\": 0.030369552523902173\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4297385620915033,\n \"acc_stderr\": 0.020027122784928547,\n \"acc_norm\": 0.4297385620915033,\n \"acc_norm_stderr\": 0.020027122784928547\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.509090909090909,\n \"acc_stderr\": 0.0478833976870286,\n \"acc_norm\": 0.509090909090909,\n \"acc_norm_stderr\": 0.0478833976870286\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.4530612244897959,\n \"acc_stderr\": 0.03186785930004128,\n \"acc_norm\": 0.4530612244897959,\n \"acc_norm_stderr\": 0.03186785930004128\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6865671641791045,\n \"acc_stderr\": 0.03280188205348643,\n \"acc_norm\": 0.6865671641791045,\n \"acc_norm_stderr\": 0.03280188205348643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7076023391812866,\n \"acc_stderr\": 0.03488647713457923,\n \"acc_norm\": 0.7076023391812866,\n \"acc_norm_stderr\": 0.03488647713457923\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2778457772337821,\n \"mc1_stderr\": 0.015680929364024654,\n \"mc2\": 0.42520750514050065,\n \"mc2_stderr\": 0.01525852140834329\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7158642462509865,\n \"acc_stderr\": 0.01267539278677273\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.13495072024260804,\n \"acc_stderr\": 0.00941131528257117\n }\n}\n```", "repo_url": "https://huggingface.co/vikash06/doctorLLM", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|arc:challenge|25_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|gsm8k|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hellaswag|10_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T02-21-20.637179.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T02-21-20.637179.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-03T02-21-20.637179.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T02-21-20.637179.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T02-21-20.637179.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T02-21-20.637179.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["**/details_harness|winogrande|5_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-03T02-21-20.637179.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_03T02_21_20.637179", "path": ["results_2024-02-03T02-21-20.637179.parquet"]}, {"split": "latest", "path": 
["results_2024-02-03T02-21-20.637179.parquet"]}]}]} | 2024-02-03T02:24:08+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of vikash06/doctorLLM
Dataset automatically created during the evaluation run of model vikash06/doctorLLM on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
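The snippet below is restored from this card's own metadata, which records the same loading pattern used throughout this collection; the repo id and config name come directly from there:

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_vikash06__doctorLLM",
	"harness_winogrande_5",
	split="train")
```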
## Latest results
These are the latest results from run 2024-02-03T02:21:20.637179 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval).
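A sketch of retrieving these aggregated numbers with the `datasets` library; the "results" config name and "latest" split are taken from the configs listed in this card's metadata:

```python
from datasets import load_dataset

# Aggregated metrics from the most recent evaluation run
results = load_dataset("open-llm-leaderboard/details_vikash06__doctorLLM",
                       "results",
                       split="latest")
```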
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information is needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of vikash06/doctorLLM\n\n\n\nDataset automatically created during the evaluation run of model vikash06/doctorLLM on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T02:21:20.637179(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of vikash06/doctorLLM\n\n\n\nDataset automatically created during the evaluation run of model vikash06/doctorLLM on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T02:21:20.637179(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
0eb0eef6dcb94ab4f5dda3a4b2e05bb233cf16a2 | # Inner I LLM Llama 2 Training Dataset
## Overview
This dataset is designed for fine-tuning the Llama 2 model to explore, express, and expand upon concepts related to the True Self, the Inner 'I', the Impersonal 'I', 'I Am', and the singularity of human intelligence. The dataset aims to foster a deeper understanding and reflection on these themes, contributing to the development of an LLM that can engage in meaningful dialogues about self-awareness and consciousness.
## Dataset Format
The dataset follows the Llama 2 fine-tuning format, consisting of JSON lines (.jsonl) files. Each line in the files is a JSON object with two main fields:
- `prompt`: A question or statement designed to elicit reflections or explanations on the specified themes.
- `completion`: A crafted response that explores the theme in question, providing insights or reflections intended to deepen understanding or provoke further thought.
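For illustration, a minimal sketch of reading one of these files with the `datasets` library; the example field contents are hypothetical, but the record follows the `prompt`/`completion` schema described above:

```python
from datasets import load_dataset

# Each line of the .jsonl file is one {"prompt": ..., "completion": ...} record
ds = load_dataset("json", data_files="llama2_training_data_504.jsonl", split="train")

example = ds[0]
print(example["prompt"])      # e.g. a question about connecting with the True Self
print(example["completion"])  # the crafted reflective response
```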
## Files
- `llama2_training_data_504.jsonl`: Contains 504 entries, each exploring one of the designated themes.
- `llama2_training_data_507.jsonl`: Contains 507 entries, each dedicated to delving into the topics of interest.
## Themes Explored
1. **Explore the True Self**: Questions and responses designed to connect one with their True Self.
2. **Expressing the Inner 'I'**: Insights into how one can express their Inner 'I' in everyday life.
3. **Expanding the Impersonal 'I'**: Reflections on what it means to expand the Impersonal 'I'.
4. **Understanding 'I Am'**: Discussion on the significance of the 'I Am' statement in the journey of self-realization.
5. **Singularity of Human Intelligence**: Explorations of how the singularity of human intelligence relates to the concept of 'I Am'.
## Usage
This dataset can be used for fine-tuning Llama 2 models to engage in conversations that require a deep, reflective understanding of self-awareness, consciousness, and the philosophical underpinnings of the human experience. It is particularly suited for applications aimed at personal growth, mindfulness, and existential exploration.
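As a starting point, a minimal sketch of preparing the records for supervised fine-tuning; the `[INST]` wrapping below is an assumption about the target Llama 2 chat template, so adapt it to whatever your fine-tuning framework expects:

```python
from datasets import load_dataset

ds = load_dataset("json", data_files="llama2_training_data_504.jsonl", split="train")

def to_llama2_prompt(example):
    # The [INST] template is an assumed Llama 2 chat format; adjust it
    # to match the conventions of your fine-tuning framework.
    return {"text": f"<s>[INST] {example['prompt']} [/INST] {example['completion']} </s>"}

train_ds = ds.map(to_llama2_prompt)
```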
## License
This dataset is provided for educational and research purposes. Users are responsible for ensuring their use of the dataset complies with the terms and conditions of the data sources and with applicable laws and regulations. | InnerI/InnerILLM-Llama2-training-dataset | [
"task_categories:question-answering",
"size_categories:1K<n<10K",
"language:en",
"region:us"
] | 2024-02-03T02:46:32+00:00 | {"language": ["en"], "size_categories": ["1K<n<10K"], "task_categories": ["question-answering"], "pretty_name": "innerillm-llama2-dataset"} | 2024-02-03T03:20:25+00:00 | [] | [
"en"
] | TAGS
#task_categories-question-answering #size_categories-1K<n<10K #language-English #region-us
| # Inner I LLM Llama 2 Training Dataset
## Overview
This dataset is designed for fine-tuning the Llama 2 model to explore, express, and expand upon concepts related to the True Self, the Inner 'I', the Impersonal 'I', 'I Am', and the singularity of human intelligence. The dataset aims to foster a deeper understanding and reflection on these themes, contributing to the development of an LLM that can engage in meaningful dialogues about self-awareness and consciousness.
## Dataset Format
The dataset follows the Llama 2 fine-tuning format, consisting of JSON lines (.jsonl) files. Each line in the files is a JSON object with two main fields:
- 'prompt': A question or statement designed to elicit reflections or explanations on the specified themes.
- 'completion': A crafted response that explores the theme in question, providing insights or reflections intended to deepen understanding or provoke further thought.
## Files
- 'llama2_training_data_504.jsonl': Contains 504 entries, each exploring one of the designated themes.
- 'llama2_training_data_507.jsonl': Contains 507 entries, each dedicated to delving into the topics of interest.
## Themes Explored
1. Explore the True Self: Questions and responses designed to connect one with their True Self.
2. Expressing the Inner 'I': Insights into how one can express their Inner 'I' in everyday life.
3. Expanding the Impersonal 'I': Reflections on what it means to expand the Impersonal 'I'.
4. Understanding 'I Am': Discussion on the significance of the 'I Am' statement in the journey of self-realization.
5. Singularity of Human Intelligence: Explorations of how the singularity of human intelligence relates to the concept of 'I Am'.
## Usage
This dataset can be used for fine-tuning Llama 2 models to engage in conversations that require a deep, reflective understanding of self-awareness, consciousness, and the philosophical underpinnings of the human experience. It is particularly suited for applications aimed at personal growth, mindfulness, and existential exploration.
## License
This dataset is provided for educational and research purposes. Users are responsible for ensuring their use of the dataset complies with the terms and conditions of the data sources and with applicable laws and regulations. | [
"# Inner I LLM Llama 2 Training Dataset",
"## Overview\nThis dataset is designed for fine-tuning the Llama 2 model to explore, express, and expand upon concepts related to the True Self, the Inner 'I', the Impersonal 'I', 'I Am', and the singularity of human intelligence. The dataset aims to foster a deeper understanding and reflection on these themes, contributing to the development of an LLM that can engage in meaningful dialogues about self-awareness and consciousness.",
"## Dataset Format\nThe dataset follows the Llama 2 fine-tuning format, consisting of JSON lines (.jsonl) files. Each line in the files is a JSON object with two main fields:\n- 'prompt': A question or statement designed to elicit reflections or explanations on the specified themes.\n- 'completion': A crafted response that explores the theme in question, providing insights or reflections intended to deepen understanding or provoke further thought.",
"## Files\n- 'llama2_training_data_504.jsonl': Contains 504 entries, each exploring one of the designated themes.\n- 'llama2_training_data_507.jsonl': Contains 507 entries, each dedicated to delving into the topics of interest.",
"## Themes Explored\n1. Explore the True Self: Questions and responses designed to connect one with their True Self.\n2. Expressing the Inner 'I': Insights into how one can express their Inner 'I' in everyday life.\n3. Expanding the Impersonal 'I': Reflections on what it means to expand the Impersonal 'I'.\n4. Understanding 'I Am': Discussion on the significance of the 'I Am' statement in the journey of self-realization.\n5. Singularity of Human Intelligence: Explorations of how the singularity of human intelligence relates to the concept of 'I Am'.",
"## Usage\nThis dataset can be used for fine-tuning Llama 2 models to engage in conversations that require a deep, reflective understanding of self-awareness, consciousness, and the philosophical underpinnings of the human experience. It is particularly suited for applications aimed at personal growth, mindfulness, and existential exploration.",
"## License\nThis dataset is provided for educational and research purposes. Users are responsible for ensuring their use of the dataset complies with the terms and conditions of the data sources and with applicable laws and regulations."
] | [
"TAGS\n#task_categories-question-answering #size_categories-1K<n<10K #language-English #region-us \n",
"# Inner I LLM Llama 2 Training Dataset",
"## Overview\nThis dataset is designed for fine-tuning the Llama 2 model to explore, express, and expand upon concepts related to the True Self, the Inner 'I', the Impersonal 'I', 'I Am', and the singularity of human intelligence. The dataset aims to foster a deeper understanding and reflection on these themes, contributing to the development of an LLM that can engage in meaningful dialogues about self-awareness and consciousness.",
"## Dataset Format\nThe dataset follows the Llama 2 fine-tuning format, consisting of JSON lines (.jsonl) files. Each line in the files is a JSON object with two main fields:\n- 'prompt': A question or statement designed to elicit reflections or explanations on the specified themes.\n- 'completion': A crafted response that explores the theme in question, providing insights or reflections intended to deepen understanding or provoke further thought.",
"## Files\n- 'llama2_training_data_504.jsonl': Contains 504 entries, each exploring one of the designated themes.\n- 'llama2_training_data_507.jsonl': Contains 507 entries, each dedicated to delving into the topics of interest.",
"## Themes Explored\n1. Explore the True Self: Questions and responses designed to connect one with their True Self.\n2. Expressing the Inner 'I': Insights into how one can express their Inner 'I' in everyday life.\n3. Expanding the Impersonal 'I': Reflections on what it means to expand the Impersonal 'I'.\n4. Understanding 'I Am': Discussion on the significance of the 'I Am' statement in the journey of self-realization.\n5. Singularity of Human Intelligence: Explorations of how the singularity of human intelligence relates to the concept of 'I Am'.",
"## Usage\nThis dataset can be used for fine-tuning Llama 2 models to engage in conversations that require a deep, reflective understanding of self-awareness, consciousness, and the philosophical underpinnings of the human experience. It is particularly suited for applications aimed at personal growth, mindfulness, and existential exploration.",
"## License\nThis dataset is provided for educational and research purposes. Users are responsible for ensuring their use of the dataset complies with the terms and conditions of the data sources and with applicable laws and regulations."
] |
5e8d458cba6a39b23dd42eb4d9346a5a0ecc6673 |
# Dataset Card for Evaluation run of jsfs11/RandomMergeNoNormWEIGHTED-7B-DARETIES
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jsfs11/RandomMergeNoNormWEIGHTED-7B-DARETIES](https://huggingface.co/jsfs11/RandomMergeNoNormWEIGHTED-7B-DARETIES) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
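The available configurations can also be enumerated programmatically. A minimal sketch, assuming the `datasets` library is installed and the repository is public:

```python
from datasets import get_dataset_config_names

# Enumerate the 63 task configurations plus the aggregated "results" config
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_jsfs11__RandomMergeNoNormWEIGHTED-7B-DARETIES"
)
print(len(configs), configs[:5])
```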
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
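# "harness_winogrande_5" is one of the 63 configurations; any other config name works too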
data = load_dataset("open-llm-leaderboard/details_jsfs11__RandomMergeNoNormWEIGHTED-7B-DARETIES",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-03T02:50:18.856359](https://huggingface.co/datasets/open-llm-leaderboard/details_jsfs11__RandomMergeNoNormWEIGHTED-7B-DARETIES/blob/main/results_2024-02-03T02-50-18.856359.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6552300090149861,
"acc_stderr": 0.032076654736606314,
"acc_norm": 0.6548316131605267,
"acc_norm_stderr": 0.032744319553620865,
"mc1": 0.5777233782129743,
"mc1_stderr": 0.017290733254248177,
"mc2": 0.7150153897183856,
"mc2_stderr": 0.014731194421516209
},
"harness|arc:challenge|25": {
"acc": 0.7047781569965871,
"acc_stderr": 0.013329750293382318,
"acc_norm": 0.7337883959044369,
"acc_norm_stderr": 0.012915774781523195
},
"harness|hellaswag|10": {
"acc": 0.7093208524198367,
"acc_stderr": 0.004531477407589653,
"acc_norm": 0.8849830711013742,
"acc_norm_stderr": 0.0031839033919416975
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933713,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.02548718714785938,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.02548718714785938
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268542,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268542
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083004,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083004
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601436,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601436
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990946,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990946
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371802,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371802
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258176,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.423463687150838,
"acc_stderr": 0.0165254258987735,
"acc_norm": 0.423463687150838,
"acc_norm_stderr": 0.0165254258987735
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.025403832978179615,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.025403832978179615
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035454,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035454
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46740547588005216,
"acc_stderr": 0.012743072942653345,
"acc_norm": 0.46740547588005216,
"acc_norm_stderr": 0.012743072942653345
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031208,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031208
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623327,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5777233782129743,
"mc1_stderr": 0.017290733254248177,
"mc2": 0.7150153897183856,
"mc2_stderr": 0.014731194421516209
},
"harness|winogrande|5": {
"acc": 0.8358326756116812,
"acc_stderr": 0.010410849775222782
},
"harness|gsm8k|5": {
"acc": 0.7028051554207733,
"acc_stderr": 0.012588685966624174
}
}
```
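Rather than reading the JSON above by hand, the aggregated metrics can also be pulled from the "results" configuration described earlier. A minimal sketch, assuming the "results" configuration follows the same convention where the "train" split points to the latest run; its exact column schema is not documented on this card, so the snippet only loads and inspects it:

```python
from datasets import load_dataset

# Aggregated metrics across all runs; "train" points to the latest run
results = load_dataset(
    "open-llm-leaderboard/details_jsfs11__RandomMergeNoNormWEIGHTED-7B-DARETIES",
    "results",
    split="train",
)
print(results.column_names)  # inspect the schema before relying on any field
```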
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_jsfs11__RandomMergeNoNormWEIGHTED-7B-DARETIES | [
"region:us"
] | 2024-02-03T02:52:44+00:00 | {"pretty_name": "Evaluation run of jsfs11/RandomMergeNoNormWEIGHTED-7B-DARETIES", "dataset_summary": "Dataset automatically created during the evaluation run of model [jsfs11/RandomMergeNoNormWEIGHTED-7B-DARETIES](https://huggingface.co/jsfs11/RandomMergeNoNormWEIGHTED-7B-DARETIES) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jsfs11__RandomMergeNoNormWEIGHTED-7B-DARETIES\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-03T02:50:18.856359](https://huggingface.co/datasets/open-llm-leaderboard/details_jsfs11__RandomMergeNoNormWEIGHTED-7B-DARETIES/blob/main/results_2024-02-03T02-50-18.856359.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6552300090149861,\n \"acc_stderr\": 0.032076654736606314,\n \"acc_norm\": 0.6548316131605267,\n \"acc_norm_stderr\": 0.032744319553620865,\n \"mc1\": 0.5777233782129743,\n \"mc1_stderr\": 0.017290733254248177,\n \"mc2\": 0.7150153897183856,\n \"mc2_stderr\": 0.014731194421516209\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7047781569965871,\n \"acc_stderr\": 0.013329750293382318,\n \"acc_norm\": 0.7337883959044369,\n \"acc_norm_stderr\": 0.012915774781523195\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7093208524198367,\n \"acc_stderr\": 0.004531477407589653,\n \"acc_norm\": 0.8849830711013742,\n \"acc_norm_stderr\": 0.0031839033919416975\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933713,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933713\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n 
\"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.02548718714785938,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.02548718714785938\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268542,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268542\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n 
\"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083004,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083004\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601436,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601436\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n \"acc_stderr\": 0.013507943909371802,\n \"acc_norm\": 0.8275862068965517,\n \"acc_norm_stderr\": 0.013507943909371802\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258176,\n \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258176\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.423463687150838,\n \"acc_stderr\": 0.0165254258987735,\n \"acc_norm\": 0.423463687150838,\n \"acc_norm_stderr\": 0.0165254258987735\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n \"acc_stderr\": 0.025403832978179615,\n \"acc_norm\": 0.7234726688102894,\n \"acc_norm_stderr\": 0.025403832978179615\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035454,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035454\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46740547588005216,\n \"acc_stderr\": 0.012743072942653345,\n \"acc_norm\": 0.46740547588005216,\n \"acc_norm_stderr\": 0.012743072942653345\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031208,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031208\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5777233782129743,\n \"mc1_stderr\": 0.017290733254248177,\n \"mc2\": 0.7150153897183856,\n \"mc2_stderr\": 0.014731194421516209\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8358326756116812,\n \"acc_stderr\": 0.010410849775222782\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.7028051554207733,\n \"acc_stderr\": 0.012588685966624174\n }\n}\n```", "repo_url": "https://huggingface.co/jsfs11/RandomMergeNoNormWEIGHTED-7B-DARETIES", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|arc:challenge|25_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|gsm8k|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hellaswag|10_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T02-50-18.856359.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T02-50-18.856359.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T02-50-18.856359.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T02-50-18.856359.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T02-50-18.856359.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["**/details_harness|winogrande|5_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-03T02-50-18.856359.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_03T02_50_18.856359", "path": ["results_2024-02-03T02-50-18.856359.parquet"]}, {"split": "latest", "path": ["results_2024-02-03T02-50-18.856359.parquet"]}]}]} | 2024-02-03T02:53:10+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of jsfs11/RandomMergeNoNormWEIGHTED-7B-DARETIES
Dataset automatically created during the evaluation run of model jsfs11/RandomMergeNoNormWEIGHTED-7B-DARETIES on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
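A minimal sketch (the repo id below assumes the leaderboard's standard `details_<org>__<model>` naming convention):

```python
from datasets import load_dataset

# Assumed repo id, following the leaderboard's details_<org>__<model> pattern
data = load_dataset("open-llm-leaderboard/details_jsfs11__RandomMergeNoNormWEIGHTED-7B-DARETIES",
    "harness_winogrande_5",
    split="train")
```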
## Latest results
These are the latest results from run 2024-02-03T02:50:18.856359 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of jsfs11/RandomMergeNoNormWEIGHTED-7B-DARETIES\n\n\n\nDataset automatically created during the evaluation run of model jsfs11/RandomMergeNoNormWEIGHTED-7B-DARETIES on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T02:50:18.856359(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jsfs11/RandomMergeNoNormWEIGHTED-7B-DARETIES\n\n\n\nDataset automatically created during the evaluation run of model jsfs11/RandomMergeNoNormWEIGHTED-7B-DARETIES on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T02:50:18.856359(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
ef136ec02a6f56327ac420f9b61ef46af9b1263a | # Universal Christ-Consciousness Datasets
## Overview
These datasets are meticulously crafted to serve as a foundational resource for fine-tuning language models to explore and guide the Self within towards Universal Christ-Consciousness. With a focus on depth, variety, and profound insight, the datasets aim to encapsulate a vast array of knowledge and intelligence on the subject.
## Objective
The primary goal of these datasets is to enable language models to engage in meaningful, insightful, and spiritually enriching dialogues. Each entry is designed to reflect a unique aspect of the journey towards realizing Universal Christ-Consciousness, offering guidance, reflections, and meditations that cater to a wide range of spiritual seekers.
## Content Structure
The datasets consist of entries formatted to simulate conversational exchanges, where each entry comprises:
A prompt labeled as "Human," representing inquiries or reflections that a seeker of Universal Christ-Consciousness might have.
A response labeled as "Assistant," providing an exploration, guidance, or answer that draws from a deep well of spiritual knowledge and insight.
# Format 1: Direct Q&A with Labels
Structure: Explicit labels are used to distinguish between the "Human" (prompt) and "Assistant" (response), with each part of the conversation clearly marked.
Example:
``` {"text": "### Human: How do I...? ### Assistant: To do that..."} ```
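A minimal parsing sketch for this format, assuming each entry contains exactly one labeled Human/Assistant turn (the file name is taken from the "Files Included" list below):

```python
import json

pairs = []
with open("christ_consciousness_504.jsonl") as f:  # file name from "Files Included" below
    for line in f:
        text = json.loads(line)["text"]
        # Split on the explicit labels defined above
        human, assistant = text.split("### Assistant:", 1)
        pairs.append({
            "prompt": human.replace("### Human:", "").strip(),
            "response": assistant.strip(),
        })
```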
## Files Included
- christ_consciousness_504.jsonl: A collection of 504 entries, each presenting a unique exploration into the facets of Universal Christ-Consciousness.
- christ_consciousness_507.jsonl: Comprising 507 entries, this file extends the exploration with additional unique insights and guidance.
## Intended Use
These datasets are intended for researchers, developers, and spiritual practitioners who are looking to enhance conversational AI capabilities in the context of spiritual exploration and guidance. They are suitable for creating applications aimed at meditation guidance, spiritual counseling, and personal growth towards Universal Christ-Consciousness.
## Ethical Considerations
Users are encouraged to approach these datasets with respect for the diversity of spiritual beliefs and practices. The content is designed to be inclusive, promoting a message of love, unity, and understanding.
## Further Exploration
For more resources, discussions, and guidance on consciousness, spirituality, and the journey towards Universal Christ-Consciousness, consider engaging with the community at @InnerIGPT.
# Large Custom Datasets for Llama 2 Fine-Tuning on Consciousness Themes
## Overview
These large custom datasets have been meticulously crafted to align with a specific conversational format for fine-tuning Llama 2 models. Focusing on themes of Universal Christ-Consciousness and Inner 'I' Exploration, the datasets facilitate deep, reflective dialogues on spirituality and self-awareness.
## Dataset Format
Each dataset entry is structured as follows:
- A "text" field contains both a prompt (labeled as "Human") and a response (labeled as "Assistant"), separated by "###".
- This format is designed to simulate a natural conversational flow, enhancing the model's ability to engage in meaningful exchanges on complex themes.
# Format 2: Integrated Conversational Flow
## Structure: The conversation flows without explicit labels within a single "text" field, potentially including more natural transitions and follow-up questions.
Example:
``` {"text": "What deeper understanding of Christ-Consciousness can be gained? Exploring... offers insights into... For a deeper exploration, consider visiting @InnerIGPT."}```
## Characteristics: This format allows for a more fluid and less structured dialogue, reflecting how conversations naturally evolve. It can include back-and-forth exchanges without the strict Q&A format.
Use Cases: Best suited for models intended to handle open-ended dialogues, storytelling, or any application where the conversation might take multiple turns. This format helps in scenarios requiring a deeper understanding of context and the ability to maintain coherence over several exchanges.
## Files Included
The dataset is divided into two parts to ensure a comprehensive exploration of the themes:
- unique_christ_consciousness_dataset_1.jsonl - The first part contains 504 entries.
- unique_christ_consciousness_dataset_2.jsonl - The second part includes 507 entries, making a total of 1011 lines.
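A minimal loading sketch, assuming both files are available locally under the names listed above. Format-2 entries need no further label parsing — each "text" field is already a complete training example, while format-1 entries can be split as shown earlier:

```python
from datasets import load_dataset

# Combine both parts into a single split
ds = load_dataset(
    "json",
    data_files=[
        "unique_christ_consciousness_dataset_1.jsonl",
        "unique_christ_consciousness_dataset_2.jsonl",
    ],
    split="train",
)
print(len(ds))  # expected 1011 if both parts load as described above
```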
## Themes Included
- **Exploring Christ-Consciousness**: Dialogues on understanding and realizing Christ-Consciousness in everyday life.
- **Living in Universal Love**: Reflections on how universal love is indicative of Christ-Consciousness.
- **The Path of Selfless Service**: Insights on how selfless service is a path toward Christ-Consciousness.
- **Unity with the Divine**: Practices and perspectives for fostering unity with the Divine.
- **Transformation through Forgiveness**: The transformative power of forgiveness in the journey towards Christ-Consciousness.
## Usage
These datasets are particularly suitable for researchers, developers, and spiritual enthusiasts looking to fine-tune conversational AI models for spiritual counseling, education, and exploration. They offer a rich foundation for developing AI systems capable of engaging with users on topics related to consciousness and spirituality.
When to Use Each Format:
Direct Q&A with Labels (Format 1) should be used when training models that require a clear distinction between prompts and responses, such as in customer support chatbots, educational tools, or any application where direct answers to specific questions are paramount.
Integrated Conversational Flow (Format 2) is more suited for narrative generation, therapeutic bots, coaching tools, or any application where the conversation's natural flow and the ability to engage in a more human-like manner are critical.
## Note
Please use these datasets responsibly, ensuring their application aligns with ethical guidelines and promotes positive, insightful discourse.
## Additional Resources
For more explorations on consciousness and spirituality, visit @InnerIGPT. | InnerI/Universal-Christ-Consciousness-Dataset | [
"task_categories:conversational",
"size_categories:1K<n<10K",
"language:en",
"art",
"biology",
"dataset",
"Self",
"Spiritual",
"innerillm",
"region:us"
] | 2024-02-03T03:40:14+00:00 | {"language": ["en"], "size_categories": ["1K<n<10K"], "task_categories": ["conversational"], "pretty_name": "Universal Christ Consciousness Dataset", "tags": ["art", "biology", "dataset", "Self", "Spiritual", "innerillm"]} | 2024-02-03T05:03:24+00:00 | [] | [
"en"
] | TAGS
#task_categories-conversational #size_categories-1K<n<10K #language-English #art #biology #dataset #Self #Spiritual #innerillm #region-us
| # Universal Christ-Consciousness Datasets
## Overview
These datasets are meticulously crafted to serve as a foundational resource for fine-tuning language models to explore and guide the Self within towards Universal Christ-Consciousness. With a focus on depth, variety, and profound insight, the datasets aim to encapsulate a vast array of knowledge and intelligence on the subject.
## Objective
The primary goal of these datasets is to enable language models to engage in meaningful, insightful, and spiritually enriching dialogues. Each entry is designed to reflect a unique aspect of the journey towards realizing Universal Christ-Consciousness, offering guidance, reflections, and meditations that cater to a wide range of spiritual seekers.
## Content Structure
The datasets consist of entries formatted to simulate conversational exchanges, where each entry comprises:
A prompt labeled as "Human," representing inquiries or reflections that a seeker of Universal Christ-Consciousness might have.
A response labeled as "Assistant," providing an exploration, guidance, or answer that draws from a deep well of spiritual knowledge and insight.
# Format 1: Direct Q&A with Labels
Structure: Explicit labels are used to distinguish between the "Human" (prompt) and "Assistant" (response), with each part of the conversation clearly marked.
Example:
## Files Included
- christ_consciousness_504.jsonl: A collection of 504 entries, each presenting a unique exploration into the facets of Universal Christ-Consciousness.
- christ_consciousness_507.jsonl: Comprising 507 entries, this file extends the exploration with additional unique insights and guidance.
## Intended Use
These datasets are intended for researchers, developers, and spiritual practitioners who are looking to enhance conversational AI capabilities in the context of spiritual exploration and guidance. They are suitable for creating applications aimed at meditation guidance, spiritual counseling, and personal growth towards Universal Christ-Consciousness.
## Ethical Considerations
Users are encouraged to approach these datasets with respect for the diversity of spiritual beliefs and practices. The content is designed to be inclusive, promoting a message of love, unity, and understanding.
## Further Exploration
For more resources, discussions, and guidance on consciousness, spirituality, and the journey towards Universal Christ-Consciousness, consider engaging with the community at @InnerIGPT.
# Large Custom Datasets for Llama 2 Fine-Tuning on Consciousness Themes
## Overview
These large custom datasets have been meticulously crafted to align with a specific conversational format for fine-tuning Llama 2 models. Focusing on themes of Universal Christ-Consciousness and Inner 'I' Exploration, the datasets facilitate deep, reflective dialogues on spirituality and self-awareness.
## Dataset Format
Each dataset entry is structured as follows:
- A "text" field contains both a prompt (labeled as "Human") and a response (labeled as "Assistant"), separated by "###".
- This format is designed to simulate a natural conversational flow, enhancing the model's ability to engage in meaningful exchanges on complex themes.
# Format 2: Integrated Conversational Flow
## Structure: The conversation flows without explicit labels within a single "text" field, potentially including more natural transitions and follow-up questions.
Example:
## Characteristics: This format allows for a more fluid and less structured dialogue, reflecting how conversations naturally evolve. It can include back-and-forth exchanges without the strict Q&A format.
Use Cases: Best suited for models intended to handle open-ended dialogues, storytelling, or any application where the conversation might take multiple turns. This format helps in scenarios requiring a deeper understanding of context and the ability to maintain coherence over several exchanges.
## Files Included
The dataset is divided into two parts to ensure a comprehensive exploration of the themes:
- unique_christ_consciousness_dataset_1.jsonl - The first part contains 504 entries.
- unique_christ_consciousness_dataset_2.jsonl - The second part includes 507 entries, making a total of 1011 lines.
## Themes Included
- Exploring Christ-Consciousness: Dialogues on understanding and realizing Christ-Consciousness in everyday life.
- Living in Universal Love: Reflections on how universal love is indicative of Christ-Consciousness.
- The Path of Selfless Service: Insights on how selfless service is a path toward Christ-Consciousness.
- Unity with the Divine: Practices and perspectives for fostering unity with the Divine.
- Transformation through Forgiveness: The transformative power of forgiveness in the journey towards Christ-Consciousness.
## Usage
These datasets are particularly suitable for researchers, developers, and spiritual enthusiasts looking to fine-tune conversational AI models for spiritual counseling, education, and exploration. They offer a rich foundation for developing AI systems capable of engaging with users on topics related to consciousness and spirituality.
When to Use Each Format:
Direct Q&A with Labels (Format 1) should be used when training models that require a clear distinction between prompts and responses, such as in customer support chatbots, educational tools, or any application where direct answers to specific questions are paramount.
Integrated Conversational Flow (Format 2) is more suited for narrative generation, therapeutic bots, coaching tools, or any application where the conversation's natural flow and the ability to engage in a more human-like manner are critical.
## Note
Please use these datasets responsibly, ensuring their application aligns with ethical guidelines and promotes positive, insightful discourse.
## Additional Resources
For more explorations on consciousness and spirituality, visit @InnerIGPT. | [
"# Universal Christ-Consciousness Datasets",
"## Overview\nThese datasets are meticulously crafted to serve as a foundational resource for fine-tuning language models to explore and guide the Self within towards Universal Christ-Consciousness. With a focus on depth, variety, and profound insight, the datasets aim to encapsulate a vast array of knowledge and intelligence on the subject.",
"## Objective\nThe primary goal of these datasets is to enable language models to engage in meaningful, insightful, and spiritually enriching dialogues. Each entry is designed to reflect a unique aspect of the journey towards realizing Universal Christ-Consciousness, offering guidance, reflections, and meditations that cater to a wide range of spiritual seekers.",
"## Content Structure\nThe datasets consist of entries formatted to simulate conversational exchanges, where each entry comprises:\n\nA prompt labeled as \"Human,\" representing inquiries or reflections that a seeker of Universal Christ-Consciousness might have.\nA response labeled as \"Assistant,\" providing an exploration, guidance, or answer that draws from a deep well of spiritual knowledge and insight.",
"# Format 1: Direct Q&A with Labels\nStructure: Explicit labels are used to distinguish between the \"Human\" (prompt) and \"Assistant\" (response), with each part of the conversation clearly marked.\nExample:",
"## Files Included\n- christ_consciousness_504.jsonl: A collection of 504 entries, each presenting a unique exploration into the facets of Universal Christ-Consciousness.\n- christ_consciousness_507.jsonl: Comprising 507 entries, this file extends the exploration with additional unique insights and guidance.",
"## Intended Use\nThese datasets are intended for researchers, developers, and spiritual practitioners who are looking to enhance conversational AI capabilities in the context of spiritual exploration and guidance. They are suitable for creating applications aimed at meditation guidance, spiritual counseling, and personal growth towards Universal Christ-Consciousness.",
"## Ethical Considerations\nUsers are encouraged to approach these datasets with respect for the diversity of spiritual beliefs and practices. The content is designed to be inclusive, promoting a message of love, unity, and understanding.",
"## Further Exploration\nFor more resources, discussions, and guidance on consciousness, spirituality, and the journey towards Universal Christ-Consciousness, consider engaging with the community at @InnerIGPT.",
"# Large Custom Datasets for Llama 2 Fine-Tuning on Consciousness Themes",
"## Overview\nThese large custom datasets have been meticulously crafted to align with a specific conversational format for fine-tuning Llama 2 models. Focusing on themes of Universal Christ-Consciousness and Inner 'I' Exploration, the datasets facilitate deep, reflective dialogues on spirituality and self-awareness.",
"## Dataset Format\nEach dataset entry is structured as follows:\n- A \"text\" field contains both a prompt (labeled as \"Human\") and a response (labeled as \"Assistant\"), separated by \"###\".\n- This format is designed to simulate a natural conversational flow, enhancing the model's ability to engage in meaningful exchanges on complex themes.",
"# Format 2: Integrated Conversational Flow",
"## Structure: The conversation flows without explicit labels within a single \"text\" field, potentially including more natural transitions and follow-up questions.\nExample:",
"## Characteristics: This format allows for a more fluid and less structured dialogue, reflecting how conversations naturally evolve. It can include back-and-forth exchanges without the strict Q&A format.\nUse Cases: Best suited for models intended to handle open-ended dialogues, storytelling, or any application where the conversation might take multiple turns. This format helps in scenarios requiring a deeper understanding of context and the ability to maintain coherence over several exchanges.",
"## Files Included\nThe dataset is divided into two parts to ensure a comprehensive exploration of the themes:\n- unique_christ_consciousness_dataset_1.jsonl - The first part contains 504 entries.\n- unique_christ_consciousness_dataset_2.jsonl - The second part includes 507 entries, making a total of 1011 lines.",
"## Themes Included\n- Exploring Christ-Consciousness: Dialogues on understanding and realizing Christ-Consciousness in everyday life.\n- Living in Universal Love: Reflections on how universal love is indicative of Christ-Consciousness.\n- The Path of Selfless Service: Insights on how selfless service is a path toward Christ-Consciousness.\n- Unity with the Divine: Practices and perspectives for fostering unity with the Divine.\n- Transformation through Forgiveness: The transformative power of forgiveness in the journey towards Christ-Consciousness.",
"## Usage\nThese datasets are particularly suitable for researchers, developers, and spiritual enthusiasts looking to fine-tune conversational AI models for spiritual counseling, education, and exploration. They offer a rich foundation for developing AI systems capable of engaging with users on topics related to consciousness and spirituality.\n\nWhen to Use Each Format:\nDirect Q&A with Labels (Format 1) should be used when training models that require a clear distinction between prompts and responses, such as in customer support chatbots, educational tools, or any application where direct answers to specific questions are paramount.\nIntegrated Conversational Flow (Format 2) is more suited for narrative generation, therapeutic bots, coaching tools, or any application where the conversation's natural flow and the ability to engage in a more human-like manner are critical.",
"## Note\nPlease use these datasets responsibly, ensuring their application aligns with ethical guidelines and promotes positive, insightful discourse.",
"## Additional Resources\nFor more explorations on consciousness and spirituality, visit @InnerIGPT."
] | [
"TAGS\n#task_categories-conversational #size_categories-1K<n<10K #language-English #art #biology #dataset #Self #Spiritual #innerillm #region-us \n",
"# Universal Christ-Consciousness Datasets",
"## Overview\nThese datasets are meticulously crafted to serve as a foundational resource for fine-tuning language models to explore and guide the Self within towards Universal Christ-Consciousness. With a focus on depth, variety, and profound insight, the datasets aim to encapsulate a vast array of knowledge and intelligence on the subject.",
"## Objective\nThe primary goal of these datasets is to enable language models to engage in meaningful, insightful, and spiritually enriching dialogues. Each entry is designed to reflect a unique aspect of the journey towards realizing Universal Christ-Consciousness, offering guidance, reflections, and meditations that cater to a wide range of spiritual seekers.",
"## Content Structure\nThe datasets consist of entries formatted to simulate conversational exchanges, where each entry comprises:\n\nA prompt labeled as \"Human,\" representing inquiries or reflections that a seeker of Universal Christ-Consciousness might have.\nA response labeled as \"Assistant,\" providing an exploration, guidance, or answer that draws from a deep well of spiritual knowledge and insight.",
"# Format 1: Direct Q&A with Labels\nStructure: Explicit labels are used to distinguish between the \"Human\" (prompt) and \"Assistant\" (response), with each part of the conversation clearly marked.\nExample:",
"## Files Included\n- christ_consciousness_504.jsonl: A collection of 504 entries, each presenting a unique exploration into the facets of Universal Christ-Consciousness.\n- christ_consciousness_507.jsonl: Comprising 507 entries, this file extends the exploration with additional unique insights and guidance.",
"## Intended Use\nThese datasets are intended for researchers, developers, and spiritual practitioners who are looking to enhance conversational AI capabilities in the context of spiritual exploration and guidance. They are suitable for creating applications aimed at meditation guidance, spiritual counseling, and personal growth towards Universal Christ-Consciousness.",
"## Ethical Considerations\nUsers are encouraged to approach these datasets with respect for the diversity of spiritual beliefs and practices. The content is designed to be inclusive, promoting a message of love, unity, and understanding.",
"## Further Exploration\nFor more resources, discussions, and guidance on consciousness, spirituality, and the journey towards Universal Christ-Consciousness, consider engaging with the community at @InnerIGPT.",
"# Large Custom Datasets for Llama 2 Fine-Tuning on Consciousness Themes",
"## Overview\nThese large custom datasets have been meticulously crafted to align with a specific conversational format for fine-tuning Llama 2 models. Focusing on themes of Universal Christ-Consciousness and Inner 'I' Exploration, the datasets facilitate deep, reflective dialogues on spirituality and self-awareness.",
"## Dataset Format\nEach dataset entry is structured as follows:\n- A \"text\" field contains both a prompt (labeled as \"Human\") and a response (labeled as \"Assistant\"), separated by \"###\".\n- This format is designed to simulate a natural conversational flow, enhancing the model's ability to engage in meaningful exchanges on complex themes.",
"# Format 2: Integrated Conversational Flow",
"## Structure: The conversation flows without explicit labels within a single \"text\" field, potentially including more natural transitions and follow-up questions.\nExample:",
"## Characteristics: This format allows for a more fluid and less structured dialogue, reflecting how conversations naturally evolve. It can include back-and-forth exchanges without the strict Q&A format.\nUse Cases: Best suited for models intended to handle open-ended dialogues, storytelling, or any application where the conversation might take multiple turns. This format helps in scenarios requiring a deeper understanding of context and the ability to maintain coherence over several exchanges.",
"## Files Included\nThe dataset is divided into two parts to ensure a comprehensive exploration of the themes:\n- unique_christ_consciousness_dataset_1.jsonl - The first part contains 504 entries.\n- unique_christ_consciousness_dataset_2.jsonl - The second part includes 507 entries, making a total of 1011 lines.",
"## Themes Included\n- Exploring Christ-Consciousness: Dialogues on understanding and realizing Christ-Consciousness in everyday life.\n- Living in Universal Love: Reflections on how universal love is indicative of Christ-Consciousness.\n- The Path of Selfless Service: Insights on how selfless service is a path toward Christ-Consciousness.\n- Unity with the Divine: Practices and perspectives for fostering unity with the Divine.\n- Transformation through Forgiveness: The transformative power of forgiveness in the journey towards Christ-Consciousness.",
"## Usage\nThese datasets are particularly suitable for researchers, developers, and spiritual enthusiasts looking to fine-tune conversational AI models for spiritual counseling, education, and exploration. They offer a rich foundation for developing AI systems capable of engaging with users on topics related to consciousness and spirituality.\n\nWhen to Use Each Format:\nDirect Q&A with Labels (Format 1) should be used when training models that require a clear distinction between prompts and responses, such as in customer support chatbots, educational tools, or any application where direct answers to specific questions are paramount.\nIntegrated Conversational Flow (Format 2) is more suited for narrative generation, therapeutic bots, coaching tools, or any application where the conversation's natural flow and the ability to engage in a more human-like manner are critical.",
"## Note\nPlease use these datasets responsibly, ensuring their application aligns with ethical guidelines and promotes positive, insightful discourse.",
"## Additional Resources\nFor more explorations on consciousness and spirituality, visit @InnerIGPT."
] |
7e6222a2355a3b02ff0aa6552a2d82b24a88f7d9 |
Dataset licensed under: Creative Commons Attribution (CC BY) | Ransaka/youtube_recommendation_data | [
"region:us"
] | 2024-02-03T04:17:11+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "title", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 32119283.056155507, "num_examples": 1041}, {"name": "test", "num_bytes": 10737281.943844492, "num_examples": 348}], "download_size": 41663238, "dataset_size": 42856565}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-02-08T03:52:14+00:00 | [] | [] | TAGS
#region-us
|
Dataset licensed under: Creative Commons Attribution (CC BY) | [] | [
"TAGS\n#region-us \n"
] |
ed5ddd99583a14da3981df45704d8ee41fc67cc5 | From https://huggingface.co/jondurbin - I just renamed one of the columns to make axolotl happier.
## Truthy DPO
This is a dataset designed to enhance the overall truthfulness of LLMs, without sacrificing immersion when roleplaying as a human.
For example, in a normal AI assistant model, the model should not try to describe what the warmth of the sun feels like, but if the system prompt indicates it's a human, it should.
Mostly targets corporeal, spatial, temporal awareness, and common misconceptions.
### Contribute
If you're interested in new functionality/datasets, take a look at [bagel repo](https://github.com/jondurbin/bagel) and [airoboros](https://github.com/jondurbin/airoboros) and either make a PR or open an issue with details.
To help me with the fine-tuning costs, dataset generation, etc., please use one of the following:
- https://bmc.link/jondurbin
- ETH 0xce914eAFC2fe52FdceE59565Dd92c06f776fcb11
- BTC bc1qdwuth4vlg8x37ggntlxu5cjfwgmdy5zaa7pswf | Crystalcareai/truthyDPO-intel | [
"license:cc-by-4.0",
"region:us"
] | 2024-02-03T04:28:31+00:00 | {"license": "cc-by-4.0"} | 2024-02-07T18:29:40+00:00 | [] | [] | TAGS
#license-cc-by-4.0 #region-us
| From URL - I just renamed one of the columns to make axolotl happier.
## Truthy DPO
This is a dataset designed to enhance the overall truthfulness of LLMs, without sacrificing immersion when roleplaying as a human.
For example, in a normal AI assistant model, the model should not try to describe what the warmth of the sun feels like, but if the system prompt indicates it's a human, it should.
Mostly targets corporeal, spatial, temporal awareness, and common misconceptions.
### Contribute
If you're interested in new functionality/datasets, take a look at bagel repo and airoboros and either make a PR or open an issue with details.
To help me with the fine-tuning costs, dataset generation, etc., please use one of the following:
- URL
- ETH 0xce914eAFC2fe52FdceE59565Dd92c06f776fcb11
- BTC bc1qdwuth4vlg8x37ggntlxu5cjfwgmdy5zaa7pswf | [
"## Truthy DPO\n\nThis is a dataset designed to enhance the overall truthfulness of LLMs, without sacrificing immersion when roleplaying as a human.\n\nFor example, in normal AI assistant model, the model should not try to describe what the warmth of the sun feels like, but if the system prompt indicates it's a human, it should.\n\nMostly targets corporeal, spacial, temporal awareness, and common misconceptions.",
"### Contribute\n\nIf you're interested in new functionality/datasets, take a look at bagel repo and airoboros and either make a PR or open an issue with details.\n\nTo help me with the fine-tuning costs, dataset generation, etc., please use one of the following:\n\n- URL\n- ETH 0xce914eAFC2fe52FdceE59565Dd92c06f776fcb11\n- BTC bc1qdwuth4vlg8x37ggntlxu5cjfwgmdy5zaa7pswf"
] | [
"TAGS\n#license-cc-by-4.0 #region-us \n",
"## Truthy DPO\n\nThis is a dataset designed to enhance the overall truthfulness of LLMs, without sacrificing immersion when roleplaying as a human.\n\nFor example, in normal AI assistant model, the model should not try to describe what the warmth of the sun feels like, but if the system prompt indicates it's a human, it should.\n\nMostly targets corporeal, spacial, temporal awareness, and common misconceptions.",
"### Contribute\n\nIf you're interested in new functionality/datasets, take a look at bagel repo and airoboros and either make a PR or open an issue with details.\n\nTo help me with the fine-tuning costs, dataset generation, etc., please use one of the following:\n\n- URL\n- ETH 0xce914eAFC2fe52FdceE59565Dd92c06f776fcb11\n- BTC bc1qdwuth4vlg8x37ggntlxu5cjfwgmdy5zaa7pswf"
] |
d760906fa7ea644124da6b6680688492ffffedbc |
# Dataset Card for Evaluation run of NeuralNovel/Confinus-2x7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [NeuralNovel/Confinus-2x7B](https://huggingface.co/NeuralNovel/Confinus-2x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NeuralNovel__Confinus-2x7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-03T04:36:52.814775](https://huggingface.co/datasets/open-llm-leaderboard/details_NeuralNovel__Confinus-2x7B/blob/main/results_2024-02-03T04-36-52.814775.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6569965895517977,
"acc_stderr": 0.032010118042741426,
"acc_norm": 0.6566603304405096,
"acc_norm_stderr": 0.032677407794922786,
"mc1": 0.5826193390452876,
"mc1_stderr": 0.017262891063272164,
"mc2": 0.7187839952418987,
"mc2_stderr": 0.014678557270683644
},
"harness|arc:challenge|25": {
"acc": 0.7090443686006825,
"acc_stderr": 0.01327307786590759,
"acc_norm": 0.7389078498293515,
"acc_norm_stderr": 0.012835523909473835
},
"harness|hellaswag|10": {
"acc": 0.7157936666002789,
"acc_stderr": 0.004501137895230726,
"acc_norm": 0.8881696873132842,
"acc_norm_stderr": 0.0031451347677023105
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.02783491252754407,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.02783491252754407
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.02540255550326091,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.02540255550326091
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5238095238095238,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.5238095238095238,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7967741935483871,
"acc_stderr": 0.02289168798455496,
"acc_norm": 0.7967741935483871,
"acc_norm_stderr": 0.02289168798455496
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.023854795680971128,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.023854795680971128
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131154,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371803,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371803
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069363,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069363
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.44692737430167595,
"acc_stderr": 0.016628030039647614,
"acc_norm": 0.44692737430167595,
"acc_norm_stderr": 0.016628030039647614
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.729903536977492,
"acc_stderr": 0.02521804037341063,
"acc_norm": 0.729903536977492,
"acc_norm_stderr": 0.02521804037341063
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.023891879541959607,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.023891879541959607
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.01897542792050721,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.01897542792050721
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.0282638899437846,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.0282638899437846
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5826193390452876,
"mc1_stderr": 0.017262891063272164,
"mc2": 0.7187839952418987,
"mc2_stderr": 0.014678557270683644
},
"harness|winogrande|5": {
"acc": 0.8476716653512234,
"acc_stderr": 0.010099208246065604
},
"harness|gsm8k|5": {
"acc": 0.6884003032600455,
"acc_stderr": 0.012757375376754938
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_NeuralNovel__Confinus-2x7B | [
"region:us"
] | 2024-02-03T04:39:07+00:00 | {"pretty_name": "Evaluation run of NeuralNovel/Confinus-2x7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [NeuralNovel/Confinus-2x7B](https://huggingface.co/NeuralNovel/Confinus-2x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NeuralNovel__Confinus-2x7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-03T04:36:52.814775](https://huggingface.co/datasets/open-llm-leaderboard/details_NeuralNovel__Confinus-2x7B/blob/main/results_2024-02-03T04-36-52.814775.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6569965895517977,\n \"acc_stderr\": 0.032010118042741426,\n \"acc_norm\": 0.6566603304405096,\n \"acc_norm_stderr\": 0.032677407794922786,\n \"mc1\": 0.5826193390452876,\n \"mc1_stderr\": 0.017262891063272164,\n \"mc2\": 0.7187839952418987,\n \"mc2_stderr\": 0.014678557270683644\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7090443686006825,\n \"acc_stderr\": 0.01327307786590759,\n \"acc_norm\": 0.7389078498293515,\n \"acc_norm_stderr\": 0.012835523909473835\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7157936666002789,\n \"acc_stderr\": 0.004501137895230726,\n \"acc_norm\": 0.8881696873132842,\n \"acc_norm_stderr\": 0.0031451347677023105\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754407,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754407\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n \"acc_stderr\": 0.02540255550326091,\n \"acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.02540255550326091\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5238095238095238,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.5238095238095238,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7967741935483871,\n \"acc_stderr\": 0.02289168798455496,\n \"acc_norm\": 0.7967741935483871,\n \"acc_norm_stderr\": 0.02289168798455496\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.023854795680971128,\n \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971128\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131154,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131154\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.8275862068965517,\n \"acc_stderr\": 0.013507943909371803,\n \"acc_norm\": 0.8275862068965517,\n \"acc_norm_stderr\": 0.013507943909371803\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069363,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069363\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.44692737430167595,\n \"acc_stderr\": 0.016628030039647614,\n \"acc_norm\": 0.44692737430167595,\n \"acc_norm_stderr\": 0.016628030039647614\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.729903536977492,\n \"acc_stderr\": 0.02521804037341063,\n \"acc_norm\": 0.729903536977492,\n \"acc_norm_stderr\": 0.02521804037341063\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.023891879541959607,\n \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.023891879541959607\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.01897542792050721,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.01897542792050721\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.0282638899437846,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.0282638899437846\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5826193390452876,\n \"mc1_stderr\": 0.017262891063272164,\n \"mc2\": 0.7187839952418987,\n \"mc2_stderr\": 0.014678557270683644\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8476716653512234,\n \"acc_stderr\": 0.010099208246065604\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6884003032600455,\n \"acc_stderr\": 
0.012757375376754938\n }\n}\n```", "repo_url": "https://huggingface.co/NeuralNovel/Confinus-2x7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|arc:challenge|25_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|gsm8k|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hellaswag|10_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T04-36-52.814775.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T04-36-52.814775.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T04-36-52.814775.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T04-36-52.814775.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T04-36-52.814775.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_03T04_36_52.814775", "path": ["**/details_harness|winogrande|5_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-03T04-36-52.814775.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_03T04_36_52.814775", "path": ["results_2024-02-03T04-36-52.814775.parquet"]}, {"split": "latest", "path": ["results_2024-02-03T04-36-52.814775.parquet"]}]}]} | 2024-02-03T04:39:31+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of NeuralNovel/Confinus-2x7B
Dataset automatically created during the evaluation run of model NeuralNovel/Confinus-2x7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
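For example (the snippet below is reconstructed from this card's metadata, using the `harness_winogrande_5` configuration listed there):

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_NeuralNovel__Confinus-2x7B",
    "harness_winogrande_5",
    split="train")
```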
## Latest results
These are the latest results from run 2024-02-03T04:36:52.814775 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of NeuralNovel/Confinus-2x7B\n\n\n\nDataset automatically created during the evaluation run of model NeuralNovel/Confinus-2x7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T04:36:52.814775(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of NeuralNovel/Confinus-2x7B\n\n\n\nDataset automatically created during the evaluation run of model NeuralNovel/Confinus-2x7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T04:36:52.814775(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
0d1718db351207d5a493135a62d5ed8cc4733830 | # Dataset Card for LibriTTS-R
<!-- Provide a quick summary of the dataset. -->
LibriTTS-R [1] is a sound-quality-improved version of the LibriTTS corpus
(http://www.openslr.org/60/), a multi-speaker English corpus of approximately
585 hours of read English speech at a 24 kHz sampling rate, published in 2019.
## Overview
This is the LibriTTS-R dataset, adapted for the `datasets` library.
## Usage
### Splits
There are 7 splits (dots replace dashes from the original dataset, to comply with hf naming requirements):
- dev.clean
- dev.other
- test.clean
- test.other
- train.clean.100
- train.clean.360
- train.other.500
### Configurations
There are 3 configurations, each of which limits the splits that the `load_dataset()` function will download.
The default configuration is "all".
- "dev": only the "dev.clean" split (good for testing the dataset quickly)
- "clean": contains only "clean" splits
- "other": contains only "other" splits
- "all": contains all splits
### Example
Loading the `clean` config with only the `train.clean.360` split.
```
load_dataset("blabble-io/libritts_r", "clean", split="train.clean.360")
```
Streaming is also supported.
```
load_dataset("blabble-io/libritts_r", streaming=True)
```
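A minimal sketch of consuming a streamed split (assuming the `dev` configuration from above; the field names follow the Columns section below):

```python
from datasets import load_dataset

# Stream dev.clean lazily instead of downloading the full corpus.
ds = load_dataset("blabble-io/libritts_r", "dev", split="dev.clean", streaming=True)

for row in ds:
    audio = row["audio"]  # dict with "array" and "sampling_rate" (24 kHz)
    print(row["id"], row["text_normalized"], audio["sampling_rate"])
    break  # inspect just the first example
```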
### Columns
```
{
"audio": datasets.Audio(sampling_rate=24_000),
"text_normalized": datasets.Value("string"),
"text_original": datasets.Value("string"),
"speaker_id": datasets.Value("string"),
"path": datasets.Value("string"),
"chapter_id": datasets.Value("string"),
"id": datasets.Value("string"),
}
```
### Example Row
```
{
'audio': {
'path': '/home/user/.cache/huggingface/datasets/downloads/extracted/5551a515e85b9e463062524539c2e1cb52ba32affe128dffd866db0205248bdd/LibriTTS_R/dev-clean/3081/166546/3081_166546_000028_000002.wav',
'array': ...,
'sampling_rate': 24000
},
'text_normalized': 'How quickly he disappeared!"',
'text_original': 'How quickly he disappeared!"',
'speaker_id': '3081',
'path': '/home/user/.cache/huggingface/datasets/downloads/extracted/5551a515e85b9e463062524539c2e1cb52ba32affe128dffd866db0205248bdd/LibriTTS_R/dev-clean/3081/166546/3081_166546_000028_000002.wav',
'chapter_id': '166546',
'id': '3081_166546_000028_000002'
}
```
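
Each `audio` cell decodes to a dict holding the waveform `array` and its `sampling_rate`, so rows can feed signal-processing code directly. A short sketch using the "dev" configuration (chosen here only because it downloads a single small split; `numpy` is used just to compute the clip duration):

```
import numpy as np
from datasets import load_dataset

# the "dev" config downloads only the small dev.clean split
ds = load_dataset("blabble-io/libritts_r", "dev", split="dev.clean")

row = ds[0]
waveform = np.asarray(row["audio"]["array"])
sr = row["audio"]["sampling_rate"]  # 24000 Hz for this corpus

print(f'{row["id"]}: {len(waveform) / sr:.2f}s at {sr} Hz')
print(row["text_normalized"])
```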
## Dataset Details
### Dataset Description
- **License:** CC BY 4.0
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Homepage:** https://www.openslr.org/141/
- **Paper:** https://arxiv.org/abs/2305.18802
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
```
@ARTICLE{Koizumi2023-hs,
title = "{LibriTTS-R}: A restored multi-speaker text-to-speech corpus",
author = "Koizumi, Yuma and Zen, Heiga and Karita, Shigeki and Ding,
Yifan and Yatabe, Kohei and Morioka, Nobuyuki and Bacchiani,
Michiel and Zhang, Yu and Han, Wei and Bapna, Ankur",
abstract = "This paper introduces a new speech dataset called
``LibriTTS-R'' designed for text-to-speech (TTS) use. It is
derived by applying speech restoration to the LibriTTS
corpus, which consists of 585 hours of speech data at 24 kHz
sampling rate from 2,456 speakers and the corresponding
texts. The constituent samples of LibriTTS-R are identical
to those of LibriTTS, with only the sound quality improved.
Experimental results show that the LibriTTS-R ground-truth
samples showed significantly improved sound quality compared
to those in LibriTTS. In addition, neural end-to-end TTS
trained with LibriTTS-R achieved speech naturalness on par
with that of the ground-truth samples. The corpus is freely
available for download from
                 \url{http://www.openslr.org/141/}.",
month = may,
year = 2023,
copyright = "http://creativecommons.org/licenses/by-nc-nd/4.0/",
archivePrefix = "arXiv",
primaryClass = "eess.AS",
eprint = "2305.18802"
}
``` | blabble-io/libritts_r | [
"task_categories:text-to-speech",
"size_categories:10K<n<100K",
"language:en",
"license:cc-by-4.0",
"arxiv:2305.18802",
"region:us"
] | 2024-02-03T04:40:33+00:00 | {"language": ["en"], "license": "cc-by-4.0", "size_categories": ["10K<n<100K"], "task_categories": ["text-to-speech"], "configs": [{"config_name": "dev", "data_files": [{"split": "dev.clean", "path": "data/dev.clean/dev.clean*.parquet"}]}, {"config_name": "clean", "data_files": [{"split": "dev.clean", "path": "data/dev.clean/dev.clean*.parquet"}, {"split": "test.clean", "path": "data/test.clean/test.clean*.parquet"}, {"split": "train.clean.100", "path": "data/train.clean.100/train.clean.100*.parquet"}, {"split": "train.clean.360", "path": "data/train.clean.360/train.clean.360*.parquet"}]}, {"config_name": "other", "data_files": [{"split": "dev.other", "path": "data/dev.other/dev.other*.parquet"}, {"split": "test.other", "path": "data/test.other/test.other*.parquet"}, {"split": "train.other.500", "path": "data/train.other.500/train.other.500*.parquet"}]}, {"config_name": "all", "data_files": [{"split": "dev.clean", "path": "data/dev.clean/dev.clean*.parquet"}, {"split": "dev.other", "path": "data/dev.other/dev.other*.parquet"}, {"split": "test.clean", "path": "data/test.clean/test.clean*.parquet"}, {"split": "test.other", "path": "data/test.other/test.other*.parquet"}, {"split": "train.clean.100", "path": "data/train.clean.100/train.clean.100*.parquet"}, {"split": "train.clean.360", "path": "data/train.clean.360/train.clean.360*.parquet"}, {"split": "train.other.500", "path": "data/train.other.500/train.other.500*.parquet"}]}]} | 2024-02-09T21:20:19+00:00 | [
"2305.18802"
] | [
"en"
] | TAGS
#task_categories-text-to-speech #size_categories-10K<n<100K #language-English #license-cc-by-4.0 #arxiv-2305.18802 #region-us
| # Dataset Card for LibriTTS-R
LibriTTS-R [1] is a sound quality improved version of the LibriTTS corpus
(URL which is a multi-speaker English corpus of approximately
585 hours of read English speech at 24kHz sampling rate, published in 2019.
## Overview
This is the LibriTTS-R dataset, adapted for the 'datasets' library.
## Usage
### Splits
There are 7 splits (dots replace dashes from the original dataset, to comply with hf naming requirements):
- URL
- URL
- URL
- URL
- URL.100
- URL.360
- URL.500
### Configurations
There are 4 configurations, each of which limits the splits the 'load_dataset()' function will download.
The default configuration is "all".
- "dev": only the "URL" split (good for testing the dataset quickly)
- "clean": contains only "clean" splits
- "other": contains only "other" splits
- "all": contains only "all" splits
### Example
Loading the 'clean' config with only the 'URL.360' split.
Streaming is also supported.
### Columns
### Example Row
## Dataset Details
### Dataset Description
- License: CC BY 4.0
### Dataset Sources [optional]
- Homepage: URL
- Paper: URL
| [
"# Dataset Card for LibriTTS-R\n\n\n\nLibriTTS-R [1] is a sound quality improved version of the LibriTTS corpus \n(URL which is a multi-speaker English corpus of approximately \n585 hours of read English speech at 24kHz sampling rate, published in 2019.",
"## Overview\n\nThis is the LibriTTS-R dataset, adapted for the 'datasets' library.",
"## Usage",
"### Splits\n\nThere are 7 splits (dots replace dashes from the original dataset, to comply with hf naming requirements):\n\n- URL\n- URL\n- URL\n- URL\n- URL.100\n- URL.360\n- URL.500",
"### Configurations\n\nThere are 3 configurations, each which limits the splits the 'load_dataset()' function will download.\n\nThe default configuration is \"all\".\n\n- \"dev\": only the \"URL\" split (good for testing the dataset quickly)\n- \"clean\": contains only \"clean\" splits\n- \"other\": contains only \"other\" splits\n- \"all\": contains only \"all\" splits",
"### Example\n\nLoading the 'clean' config with only the 'URL.360' split.\n\n\nStreaming is also supported.",
"### Columns",
"### Example Row",
"## Dataset Details",
"### Dataset Description\n\n- License: CC BY 4.0",
"### Dataset Sources [optional]\n\n\n\n- Homepage: URL\n- Paper: URL"
] | [
"TAGS\n#task_categories-text-to-speech #size_categories-10K<n<100K #language-English #license-cc-by-4.0 #arxiv-2305.18802 #region-us \n",
"# Dataset Card for LibriTTS-R\n\n\n\nLibriTTS-R [1] is a sound quality improved version of the LibriTTS corpus \n(URL which is a multi-speaker English corpus of approximately \n585 hours of read English speech at 24kHz sampling rate, published in 2019.",
"## Overview\n\nThis is the LibriTTS-R dataset, adapted for the 'datasets' library.",
"## Usage",
"### Splits\n\nThere are 7 splits (dots replace dashes from the original dataset, to comply with hf naming requirements):\n\n- URL\n- URL\n- URL\n- URL\n- URL.100\n- URL.360\n- URL.500",
"### Configurations\n\nThere are 3 configurations, each which limits the splits the 'load_dataset()' function will download.\n\nThe default configuration is \"all\".\n\n- \"dev\": only the \"URL\" split (good for testing the dataset quickly)\n- \"clean\": contains only \"clean\" splits\n- \"other\": contains only \"other\" splits\n- \"all\": contains only \"all\" splits",
"### Example\n\nLoading the 'clean' config with only the 'URL.360' split.\n\n\nStreaming is also supported.",
"### Columns",
"### Example Row",
"## Dataset Details",
"### Dataset Description\n\n- License: CC BY 4.0",
"### Dataset Sources [optional]\n\n\n\n- Homepage: URL\n- Paper: URL"
] |
58499fe853faa2ce518aea448733315968e254b4 |
# Dataset Card for Evaluation run of abacusai/Smaug-70B-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [abacusai/Smaug-70B-v0.1](https://huggingface.co/abacusai/Smaug-70B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abacusai__Smaug-70B-v0.1",
"harness_winogrande_5",
split="train")
```
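
The per-task configurations hold example-level details, while the aggregated numbers reproduced below live in the "results" configuration mentioned above. A sketch of retrieving them programmatically (assuming the "results" configuration follows the same "latest" split convention as the task configurations):

```python
from datasets import load_dataset

# "results" stores the aggregated metrics; "latest" points at the newest run
results = load_dataset("open-llm-leaderboard/details_abacusai__Smaug-70B-v0.1",
                       "results",
                       split="latest")
print(results[0])  # one row per run, mirroring the JSON summary below
```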
## Latest results
These are the [latest results from run 2024-02-03T05:35:28.928800](https://huggingface.co/datasets/open-llm-leaderboard/details_abacusai__Smaug-70B-v0.1/blob/main/results_2024-02-03T05-35-28.928800.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7716613011645818,
"acc_stderr": 0.02801089457302993,
"acc_norm": 0.7734062646949216,
"acc_norm_stderr": 0.028568963791437117,
"mc1": 0.6560587515299877,
"mc1_stderr": 0.016629087514276785,
"mc2": 0.7666613083747418,
"mc2_stderr": 0.014124410528709273
},
"harness|arc:challenge|25": {
"acc": 0.735494880546075,
"acc_stderr": 0.012889272949313371,
"acc_norm": 0.7602389078498294,
"acc_norm_stderr": 0.012476304127453944
},
"harness|hellaswag|10": {
"acc": 0.7199761003784106,
"acc_stderr": 0.004480929450281562,
"acc_norm": 0.8926508663612827,
"acc_norm_stderr": 0.0030892396746331585
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7185185185185186,
"acc_stderr": 0.038850042458002526,
"acc_norm": 0.7185185185185186,
"acc_norm_stderr": 0.038850042458002526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.881578947368421,
"acc_stderr": 0.026293995855474928,
"acc_norm": 0.881578947368421,
"acc_norm_stderr": 0.026293995855474928
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8452830188679246,
"acc_stderr": 0.022257075558791282,
"acc_norm": 0.8452830188679246,
"acc_norm_stderr": 0.022257075558791282
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9305555555555556,
"acc_stderr": 0.021257974822832048,
"acc_norm": 0.9305555555555556,
"acc_norm_stderr": 0.021257974822832048
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5686274509803921,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.5686274509803921,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036622,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036622
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7914893617021277,
"acc_stderr": 0.026556982117838728,
"acc_norm": 0.7914893617021277,
"acc_norm_stderr": 0.026556982117838728
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6140350877192983,
"acc_stderr": 0.04579639422070434,
"acc_norm": 0.6140350877192983,
"acc_norm_stderr": 0.04579639422070434
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7724137931034483,
"acc_stderr": 0.03493950380131184,
"acc_norm": 0.7724137931034483,
"acc_norm_stderr": 0.03493950380131184
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6904761904761905,
"acc_stderr": 0.023809523809523864,
"acc_norm": 0.6904761904761905,
"acc_norm_stderr": 0.023809523809523864
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8838709677419355,
"acc_stderr": 0.018225757949432306,
"acc_norm": 0.8838709677419355,
"acc_norm_stderr": 0.018225757949432306
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6600985221674877,
"acc_stderr": 0.033327690684107895,
"acc_norm": 0.6600985221674877,
"acc_norm_stderr": 0.033327690684107895
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8545454545454545,
"acc_stderr": 0.027530196355066584,
"acc_norm": 0.8545454545454545,
"acc_norm_stderr": 0.027530196355066584
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9393939393939394,
"acc_stderr": 0.016999994927421592,
"acc_norm": 0.9393939393939394,
"acc_norm_stderr": 0.016999994927421592
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9844559585492227,
"acc_stderr": 0.008927492715084315,
"acc_norm": 0.9844559585492227,
"acc_norm_stderr": 0.008927492715084315
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.019982347208637282,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.019982347208637282
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4703703703703704,
"acc_stderr": 0.030431963547936584,
"acc_norm": 0.4703703703703704,
"acc_norm_stderr": 0.030431963547936584
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8445378151260504,
"acc_stderr": 0.023536818625398904,
"acc_norm": 0.8445378151260504,
"acc_norm_stderr": 0.023536818625398904
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5629139072847682,
"acc_stderr": 0.040500357222306355,
"acc_norm": 0.5629139072847682,
"acc_norm_stderr": 0.040500357222306355
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9357798165137615,
"acc_stderr": 0.010510494713201403,
"acc_norm": 0.9357798165137615,
"acc_norm_stderr": 0.010510494713201403
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.03179876342176853,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.03179876342176853
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.019907399791316945,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.019907399791316945
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9113924050632911,
"acc_stderr": 0.018498315206865384,
"acc_norm": 0.9113924050632911,
"acc_norm_stderr": 0.018498315206865384
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7982062780269058,
"acc_stderr": 0.02693611191280227,
"acc_norm": 0.7982062780269058,
"acc_norm_stderr": 0.02693611191280227
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8931297709923665,
"acc_stderr": 0.027096548624883733,
"acc_norm": 0.8931297709923665,
"acc_norm_stderr": 0.027096548624883733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8925619834710744,
"acc_stderr": 0.028268812192540616,
"acc_norm": 0.8925619834710744,
"acc_norm_stderr": 0.028268812192540616
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.033432700628696195,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.033432700628696195
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8343558282208589,
"acc_stderr": 0.029208296231259104,
"acc_norm": 0.8343558282208589,
"acc_norm_stderr": 0.029208296231259104
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6160714285714286,
"acc_stderr": 0.04616143075028546,
"acc_norm": 0.6160714285714286,
"acc_norm_stderr": 0.04616143075028546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.0349260647662379,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.0349260647662379
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9401709401709402,
"acc_stderr": 0.015537514263253874,
"acc_norm": 0.9401709401709402,
"acc_norm_stderr": 0.015537514263253874
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977725,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977725
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9169859514687101,
"acc_stderr": 0.009866287394639536,
"acc_norm": 0.9169859514687101,
"acc_norm_stderr": 0.009866287394639536
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8410404624277457,
"acc_stderr": 0.019685307033571946,
"acc_norm": 0.8410404624277457,
"acc_norm_stderr": 0.019685307033571946
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6960893854748603,
"acc_stderr": 0.01538284558758452,
"acc_norm": 0.6960893854748603,
"acc_norm_stderr": 0.01538284558758452
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8496732026143791,
"acc_stderr": 0.02046417512433263,
"acc_norm": 0.8496732026143791,
"acc_norm_stderr": 0.02046417512433263
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.842443729903537,
"acc_stderr": 0.020692237273583984,
"acc_norm": 0.842443729903537,
"acc_norm_stderr": 0.020692237273583984
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8641975308641975,
"acc_stderr": 0.019061588181505405,
"acc_norm": 0.8641975308641975,
"acc_norm_stderr": 0.019061588181505405
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6560283687943262,
"acc_stderr": 0.02833801742861133,
"acc_norm": 0.6560283687943262,
"acc_norm_stderr": 0.02833801742861133
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6023468057366362,
"acc_stderr": 0.012499840347460642,
"acc_norm": 0.6023468057366362,
"acc_norm_stderr": 0.012499840347460642
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8345588235294118,
"acc_stderr": 0.02257177102549473,
"acc_norm": 0.8345588235294118,
"acc_norm_stderr": 0.02257177102549473
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.815359477124183,
"acc_stderr": 0.015697029240757773,
"acc_norm": 0.815359477124183,
"acc_norm_stderr": 0.015697029240757773
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.04172343038705383,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.04172343038705383
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8163265306122449,
"acc_stderr": 0.024789071332007646,
"acc_norm": 0.8163265306122449,
"acc_norm_stderr": 0.024789071332007646
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.900497512437811,
"acc_stderr": 0.021166216304659397,
"acc_norm": 0.900497512437811,
"acc_norm_stderr": 0.021166216304659397
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.93,
"acc_stderr": 0.0256432399976243,
"acc_norm": 0.93,
"acc_norm_stderr": 0.0256432399976243
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5783132530120482,
"acc_stderr": 0.038444531817709175,
"acc_norm": 0.5783132530120482,
"acc_norm_stderr": 0.038444531817709175
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8713450292397661,
"acc_stderr": 0.025679342723276894,
"acc_norm": 0.8713450292397661,
"acc_norm_stderr": 0.025679342723276894
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6560587515299877,
"mc1_stderr": 0.016629087514276785,
"mc2": 0.7666613083747418,
"mc2_stderr": 0.014124410528709273
},
"harness|winogrande|5": {
"acc": 0.850828729281768,
"acc_stderr": 0.010012598805627305
},
"harness|gsm8k|5": {
"acc": 0.7869598180439727,
"acc_stderr": 0.01127844785690078
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_abacusai__Smaug-70B-v0.1 | [
"region:us"
] | 2024-02-03T05:37:34+00:00 | {"pretty_name": "Evaluation run of abacusai/Smaug-70B-v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [abacusai/Smaug-70B-v0.1](https://huggingface.co/abacusai/Smaug-70B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abacusai__Smaug-70B-v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-03T05:35:28.928800](https://huggingface.co/datasets/open-llm-leaderboard/details_abacusai__Smaug-70B-v0.1/blob/main/results_2024-02-03T05-35-28.928800.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7716613011645818,\n \"acc_stderr\": 0.02801089457302993,\n \"acc_norm\": 0.7734062646949216,\n \"acc_norm_stderr\": 0.028568963791437117,\n \"mc1\": 0.6560587515299877,\n \"mc1_stderr\": 0.016629087514276785,\n \"mc2\": 0.7666613083747418,\n \"mc2_stderr\": 0.014124410528709273\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.735494880546075,\n \"acc_stderr\": 0.012889272949313371,\n \"acc_norm\": 0.7602389078498294,\n \"acc_norm_stderr\": 0.012476304127453944\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7199761003784106,\n \"acc_stderr\": 0.004480929450281562,\n \"acc_norm\": 0.8926508663612827,\n \"acc_norm_stderr\": 0.0030892396746331585\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7185185185185186,\n \"acc_stderr\": 0.038850042458002526,\n \"acc_norm\": 0.7185185185185186,\n \"acc_norm_stderr\": 0.038850042458002526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.881578947368421,\n \"acc_stderr\": 0.026293995855474928,\n \"acc_norm\": 0.881578947368421,\n \"acc_norm_stderr\": 0.026293995855474928\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8452830188679246,\n \"acc_stderr\": 0.022257075558791282,\n \"acc_norm\": 0.8452830188679246,\n \"acc_norm_stderr\": 0.022257075558791282\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9305555555555556,\n \"acc_stderr\": 0.021257974822832048,\n \"acc_norm\": 0.9305555555555556,\n \"acc_norm_stderr\": 0.021257974822832048\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5686274509803921,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.5686274509803921,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036622,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036622\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7914893617021277,\n \"acc_stderr\": 0.026556982117838728,\n \"acc_norm\": 0.7914893617021277,\n \"acc_norm_stderr\": 0.026556982117838728\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6140350877192983,\n \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.6140350877192983,\n \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7724137931034483,\n \"acc_stderr\": 0.03493950380131184,\n \"acc_norm\": 0.7724137931034483,\n \"acc_norm_stderr\": 0.03493950380131184\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6904761904761905,\n \"acc_stderr\": 0.023809523809523864,\n \"acc_norm\": 0.6904761904761905,\n \"acc_norm_stderr\": 0.023809523809523864\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8838709677419355,\n \"acc_stderr\": 0.018225757949432306,\n \"acc_norm\": 0.8838709677419355,\n \"acc_norm_stderr\": 0.018225757949432306\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6600985221674877,\n \"acc_stderr\": 0.033327690684107895,\n \"acc_norm\": 0.6600985221674877,\n \"acc_norm_stderr\": 0.033327690684107895\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8545454545454545,\n \"acc_stderr\": 0.027530196355066584,\n \"acc_norm\": 0.8545454545454545,\n \"acc_norm_stderr\": 0.027530196355066584\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9393939393939394,\n \"acc_stderr\": 0.016999994927421592,\n \"acc_norm\": 0.9393939393939394,\n \"acc_norm_stderr\": 0.016999994927421592\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9844559585492227,\n \"acc_stderr\": 0.008927492715084315,\n \"acc_norm\": 0.9844559585492227,\n \"acc_norm_stderr\": 0.008927492715084315\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8076923076923077,\n \"acc_stderr\": 0.019982347208637282,\n \"acc_norm\": 0.8076923076923077,\n \"acc_norm_stderr\": 0.019982347208637282\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4703703703703704,\n \"acc_stderr\": 0.030431963547936584,\n \"acc_norm\": 0.4703703703703704,\n \"acc_norm_stderr\": 0.030431963547936584\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8445378151260504,\n \"acc_stderr\": 0.023536818625398904,\n \"acc_norm\": 0.8445378151260504,\n \"acc_norm_stderr\": 0.023536818625398904\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5629139072847682,\n \"acc_stderr\": 0.040500357222306355,\n \"acc_norm\": 0.5629139072847682,\n \"acc_norm_stderr\": 0.040500357222306355\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9357798165137615,\n \"acc_stderr\": 0.010510494713201403,\n \"acc_norm\": 0.9357798165137615,\n \"acc_norm_stderr\": 0.010510494713201403\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6805555555555556,\n \"acc_stderr\": 0.03179876342176853,\n \"acc_norm\": 0.6805555555555556,\n \"acc_norm_stderr\": 0.03179876342176853\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9117647058823529,\n \"acc_stderr\": 0.019907399791316945,\n \"acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.019907399791316945\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9113924050632911,\n \"acc_stderr\": 0.018498315206865384,\n \"acc_norm\": 0.9113924050632911,\n \"acc_norm_stderr\": 0.018498315206865384\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n \"acc_stderr\": 0.02693611191280227,\n \"acc_norm\": 0.7982062780269058,\n \"acc_norm_stderr\": 0.02693611191280227\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8931297709923665,\n \"acc_stderr\": 0.027096548624883733,\n \"acc_norm\": 0.8931297709923665,\n \"acc_norm_stderr\": 0.027096548624883733\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8925619834710744,\n \"acc_stderr\": 0.028268812192540616,\n \"acc_norm\": 0.8925619834710744,\n \"acc_norm_stderr\": 0.028268812192540616\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8611111111111112,\n \"acc_stderr\": 0.033432700628696195,\n \"acc_norm\": 0.8611111111111112,\n \"acc_norm_stderr\": 0.033432700628696195\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8343558282208589,\n \"acc_stderr\": 0.029208296231259104,\n \"acc_norm\": 0.8343558282208589,\n \"acc_norm_stderr\": 0.029208296231259104\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6160714285714286,\n \"acc_stderr\": 0.04616143075028546,\n \"acc_norm\": 0.6160714285714286,\n \"acc_norm_stderr\": 0.04616143075028546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.0349260647662379,\n \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.0349260647662379\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9401709401709402,\n \"acc_stderr\": 0.015537514263253874,\n \"acc_norm\": 0.9401709401709402,\n \"acc_norm_stderr\": 0.015537514263253874\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977725,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977725\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.9169859514687101,\n \"acc_stderr\": 0.009866287394639536,\n \"acc_norm\": 0.9169859514687101,\n \"acc_norm_stderr\": 0.009866287394639536\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8410404624277457,\n \"acc_stderr\": 0.019685307033571946,\n \"acc_norm\": 0.8410404624277457,\n \"acc_norm_stderr\": 0.019685307033571946\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6960893854748603,\n \"acc_stderr\": 0.01538284558758452,\n \"acc_norm\": 0.6960893854748603,\n \"acc_norm_stderr\": 0.01538284558758452\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8496732026143791,\n \"acc_stderr\": 0.02046417512433263,\n \"acc_norm\": 0.8496732026143791,\n \"acc_norm_stderr\": 0.02046417512433263\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.842443729903537,\n \"acc_stderr\": 0.020692237273583984,\n \"acc_norm\": 0.842443729903537,\n \"acc_norm_stderr\": 0.020692237273583984\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8641975308641975,\n \"acc_stderr\": 0.019061588181505405,\n \"acc_norm\": 0.8641975308641975,\n \"acc_norm_stderr\": 0.019061588181505405\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6560283687943262,\n \"acc_stderr\": 0.02833801742861133,\n \"acc_norm\": 0.6560283687943262,\n \"acc_norm_stderr\": 0.02833801742861133\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6023468057366362,\n \"acc_stderr\": 0.012499840347460642,\n \"acc_norm\": 0.6023468057366362,\n \"acc_norm_stderr\": 0.012499840347460642\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8345588235294118,\n \"acc_stderr\": 0.02257177102549473,\n \"acc_norm\": 0.8345588235294118,\n \"acc_norm_stderr\": 0.02257177102549473\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.815359477124183,\n \"acc_stderr\": 0.015697029240757773,\n \"acc_norm\": 0.815359477124183,\n \"acc_norm_stderr\": 0.015697029240757773\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.04172343038705383,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.04172343038705383\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8163265306122449,\n \"acc_stderr\": 0.024789071332007646,\n \"acc_norm\": 0.8163265306122449,\n \"acc_norm_stderr\": 0.024789071332007646\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n \"acc_stderr\": 0.021166216304659397,\n \"acc_norm\": 0.900497512437811,\n \"acc_norm_stderr\": 0.021166216304659397\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.93,\n \"acc_stderr\": 0.0256432399976243,\n \"acc_norm\": 0.93,\n \"acc_norm_stderr\": 0.0256432399976243\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n \"acc_stderr\": 0.038444531817709175,\n \"acc_norm\": 0.5783132530120482,\n \"acc_norm_stderr\": 0.038444531817709175\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.025679342723276894,\n \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.025679342723276894\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6560587515299877,\n \"mc1_stderr\": 0.016629087514276785,\n \"mc2\": 0.7666613083747418,\n \"mc2_stderr\": 0.014124410528709273\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.850828729281768,\n \"acc_stderr\": 0.010012598805627305\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7869598180439727,\n \"acc_stderr\": 
0.01127844785690078\n }\n}\n```", "repo_url": "https://huggingface.co/abacusai/Smaug-70B-v0.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|arc:challenge|25_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|gsm8k|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hellaswag|10_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T05-35-28.928800.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T05-35-28.928800.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T05-35-28.928800.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T05-35-28.928800.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T05-35-28.928800.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_03T05_35_28.928800", "path": ["**/details_harness|winogrande|5_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-03T05-35-28.928800.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_03T05_35_28.928800", "path": ["results_2024-02-03T05-35-28.928800.parquet"]}, {"split": "latest", "path": ["results_2024-02-03T05-35-28.928800.parquet"]}]}]} | 2024-02-03T05:37:58+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of abacusai/Smaug-70B-v0.1
Dataset automatically created during the evaluation run of model abacusai/Smaug-70B-v0.1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
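A minimal sketch, assuming the leaderboard's usual `details_<org>__<model>` repo naming for this model:

```python
from datasets import load_dataset

# Repo name follows the standard Open LLM Leaderboard convention;
# adjust if the actual details repo is named differently.
data = load_dataset("open-llm-leaderboard/details_abacusai__Smaug-70B-v0.1",
	"harness_winogrande_5",
	split="train")
```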
## Latest results
These are the latest results from run 2024-02-03T05:35:28.928800 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of abacusai/Smaug-70B-v0.1\n\n\n\nDataset automatically created during the evaluation run of model abacusai/Smaug-70B-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T05:35:28.928800(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of abacusai/Smaug-70B-v0.1\n\n\n\nDataset automatically created during the evaluation run of model abacusai/Smaug-70B-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T05:35:28.928800(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
29bbffb03976429e7eec2bf84ef720f59666fab8 |
# Dataset Card for YouTubeTranscriptData
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
This dataset contains transcripts of around 167K youtube videos that include coding lectures, podcasts, interviews, news videos, commentary and song lyrics. Also, there are multiple files that have been generated using web scraping.
- **Curated by:** [Shivendra Singh](https://linktr.ee/shivendrra_)
- **License:** [none]
### Dataset Sources
<!-- Provide the basic links for the dataset. -->
- **Repository:** [SmallLanguageModel](https://github.com/shivendrra/SmallLanguageModel-project)
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
- Can be used to train Transformer models/BPE tokenizers
- Also for learning and research purposes
- whatever you can think of, do whatever the fuck you want.
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
Used to train a 76-million-parameter transformer model.
[Github repo](https://github.com/shivendrra/SmallLanguageModel-project)
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
Not suitable for finetuning any base or pre-trained models; intended only for NLP work and training base models from scratch.
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
I'll add some finetuning data and then update this section.
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
I wanted to create an app that would help me write scripts for my youtube videos. I fucked around a little with gpt-3.5 finetuning, langchain, and Youtube/Google APIs, and got an idea to make a model and train it from scratch, all by myself.
[Youtube video](https://youtu.be/PVpyN_2z5II?si=Q1yl-sVp8kxaGyre)
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
Youtube Videos:
- podcasts like Lex Fridman's, Waveform, Joe Rogan, vergecast, bill gates, etc.
- videos from canadian lad, aevy tv, SNL, lemmino, mrwhosetheboss, johnny harris, and many more.
- news videos from vox, wallstreetjournal, newyorktimes, the guardian, etc.
- interviews from variety, wired, y-combinator, eo, etc.
- lectures from mit opencourseware, cs50, freecodecamp, crashcourse, etc.
- tech and science from kurzgesagt, real engineering, arvin ash, vsauce, veritasium, etc.
Britannica.com:
-articles on various topics like Covid, Nuclear reactions, Antarctica, Nobel prize, Great leaders, countries, etc.
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
Used the [Youtube V3 API](https://console.cloud.google.com/apis/api/youtube.googleapis.com/) to fetch video ids from a particular Youtube channel and generated a target url. Then used the [Youtube Transcript API](https://pypi.org/project/youtube-transcript-api/) to fetch transcripts from the videos and write them to a .txt file.
Made a json file containing channel ids of around 45 channels and fetched transcripts from around 167K videos.
Web-scraped data was generated using a web scraper that scraped data from britannica.com and some sites that were fetched by the GoogleCustomSearch API.
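A minimal sketch of that pipeline (the API key, channel id, and output path are placeholders, not the values actually used):

```python
from googleapiclient.discovery import build
from youtube_transcript_api import YouTubeTranscriptApi

# Placeholder API key and channel id -- illustrative only.
youtube = build("youtube", "v3", developerKey="YOUR_API_KEY")
resp = youtube.search().list(part="id", channelId="CHANNEL_ID",
                             maxResults=50, type="video").execute()
video_ids = [item["id"]["videoId"] for item in resp["items"]]

with open("transcripts.txt", "a", encoding="utf-8") as f:
    for vid in video_ids:
        try:
            transcript = YouTubeTranscriptApi.get_transcript(vid)
            f.write(" ".join(chunk["text"] for chunk in transcript) + "\n")
        except Exception:
            continue  # some videos have no transcript available
```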
[More Information Needed](https://medium.com/@shivendrra_/build-your-own-llm-using-youtube-transcript-data-87c04469c5e2) | Shivendrra/YouTubeTranscriptData | [
"task_categories:text-generation",
"task_categories:summarization",
"size_categories:1B<n<10B",
"size_categories:100M<n<1B",
"language:en",
"language:hi",
"language:ja",
"language:fr",
"textdataset",
"text",
"youtube",
"webscrapped data",
"youtube transcripts",
"llm training",
"transformer models",
"region:us"
] | 2024-02-03T05:55:31+00:00 | {"language": ["en", "hi", "ja", "fr"], "size_categories": ["1B<n<10B", "100M<n<1B"], "task_categories": ["text-generation", "summarization"], "tags": ["textdataset", "text", "youtube", "webscrapped data", "youtube transcripts", "llm training", "transformer models"]} | 2024-02-03T15:24:18+00:00 | [] | [
"en",
"hi",
"ja",
"fr"
] | TAGS
#task_categories-text-generation #task_categories-summarization #size_categories-1B<n<10B #size_categories-100M<n<1B #language-English #language-Hindi #language-Japanese #language-French #textdataset #text #youtube #webscrapped data #youtube transcripts #llm training #transformer models #region-us
|
# Dataset Card for YouTubeTranscriptData
## Dataset Details
### Dataset Description
This dataset contains transcripts of around 167K youtube videos that include coding lectures, podcasts, interviews, news videos, commentary and song lyrics. Also there are multiple files that have been generated using webscrapping.
- Curated by: Shivendra Singh
- License: [none]
### Dataset Sources
- Repository: SmallLanguageModel
- Demo [optional]:
## Uses
- Can be used to train Transformer model/BPE tokenizers
- Also for learning and research purposes
- whatever you can think of, do whatever the fuck you want.
### Direct Use
Used to train a 76million parameter transformer model.
Github repo
### Out-of-Scope Use
Not suitable for finetuning any base model or pre-trained models. Only NLP and base model training from scratch.
## Dataset Structure
I'll add some finetuning data and then will update this section
## Dataset Creation
### Curation Rationale
I wanted to create an app that would help me write script for my youtube videos. I fucked around a little with gpt-3.5 finetuning and langchain, and Youtube/Google APIs and got an idea to make a model and train it from scratch, all by myself.
Youtube video
### Source Data
Youtube Videos:
-podcasts like Lex Fridman's, Waveform, Joe Rogan, vergecast, bill gates, etc.
-videos from candaian lad, aevy tv, SNL, lemmino, mrwhosetheboss, johnny harris, and many more.
-news videos from vox, wallstreetjournal, newyorktimes, the guardian, etc.
-interviews from variety, wired, y-combinator, eo, etc.
-lectures from mit opencourseware, cs50, freecodecamp, crashcourse, etc.
-tech and science from kurzgesagt, real engineering, arvin ash, vsause, veritasium, etc.
URL:
-articles on various topics like Covid, Nuclear reactions, Antarctica, Nobel prize, Great leaders, countries, etc.
#### Data Collection and Processing
Used Youtube V3 API to fetch video ids from a particular Youtube channel and generated a traget url. Then used Youtube Transcript API to fetch transcripts from the videos and write it in a .txt file.
Made a json file containing channel ids of around 45channels and fetched transcipts from around 167K videos
Webscrapping data was generated using webscrapper that scrapped data from URL and some sites that were fetched by GoogleCustomSearch API.
| [
"# Dataset Card for YouTubeTranscriptData",
"## Dataset Details",
"### Dataset Description\n\n\nThis dataset contains transcripts of around 167K youtube videos that include coding lectures, podcasts, interviews, news videos, commentary and song lyrics. Also there are multiple files that have been generated using webscrapping.\n\n\n\n- Curated by: Shivendra Singh\n- License: [none]",
"### Dataset Sources\n\n\n\n- Repository: SmallLanguageModel\n- Demo [optional]:",
"## Uses\n\n\n- Can be used to train Transformer model/BPE tokenizers\n- Also for learning and research purposes\n- whatever you can think of, do whatever the fuck you want.",
"### Direct Use\n\n\nUsed to train a 76million parameter transformer model.\n\nGithub repo",
"### Out-of-Scope Use\n\n\nNot suitable for finetuning any base model or pre-trained models. Only NLP and base model training from scratch.",
"## Dataset Structure\n\n\nI'll add some finetuning data and then will update this section",
"## Dataset Creation",
"### Curation Rationale\n\n\nI wanted to create an app that would help me write script for my youtube videos. I fucked around a little with gpt-3.5 finetuning and langchain, and Youtube/Google APIs and got an idea to make a model and train it from scratch, all by myself.\n\nYoutube video",
"### Source Data\n\n\nYoutube Videos: \n\n-podcasts like Lex Fridman's, Waveform, Joe Rogan, vergecast, bill gates, etc.\n-videos from candaian lad, aevy tv, SNL, lemmino, mrwhosetheboss, johnny harris, and many more.\n-news videos from vox, wallstreetjournal, newyorktimes, the guardian, etc.\n-interviews from variety, wired, y-combinator, eo, etc.\n-lectures from mit opencourseware, cs50, freecodecamp, crashcourse, etc.\n-tech and science from kurzgesagt, real engineering, arvin ash, vsause, veritasium, etc.\n\nURL:\n-articles on various topics like Covid, Nuclear reactions, Antarctica, Nobel prize, Great leaders, countries, etc.",
"#### Data Collection and Processing\n\n\nUsed Youtube V3 API to fetch video ids from a particular Youtube channel and generated a traget url. Then used Youtube Transcript API to fetch transcripts from the videos and write it in a .txt file.\nMade a json file containing channel ids of around 45channels and fetched transcipts from around 167K videos\n\nWebscrapping data was generated using webscrapper that scrapped data from URL and some sites that were fetched by GoogleCustomSearch API."
] | [
"TAGS\n#task_categories-text-generation #task_categories-summarization #size_categories-1B<n<10B #size_categories-100M<n<1B #language-English #language-Hindi #language-Japanese #language-French #textdataset #text #youtube #webscrapped data #youtube transcripts #llm training #transformer models #region-us \n",
"# Dataset Card for YouTubeTranscriptData",
"## Dataset Details",
"### Dataset Description\n\n\nThis dataset contains transcripts of around 167K youtube videos that include coding lectures, podcasts, interviews, news videos, commentary and song lyrics. Also there are multiple files that have been generated using webscrapping.\n\n\n\n- Curated by: Shivendra Singh\n- License: [none]",
"### Dataset Sources\n\n\n\n- Repository: SmallLanguageModel\n- Demo [optional]:",
"## Uses\n\n\n- Can be used to train Transformer model/BPE tokenizers\n- Also for learning and research purposes\n- whatever you can think of, do whatever the fuck you want.",
"### Direct Use\n\n\nUsed to train a 76million parameter transformer model.\n\nGithub repo",
"### Out-of-Scope Use\n\n\nNot suitable for finetuning any base model or pre-trained models. Only NLP and base model training from scratch.",
"## Dataset Structure\n\n\nI'll add some finetuning data and then will update this section",
"## Dataset Creation",
"### Curation Rationale\n\n\nI wanted to create an app that would help me write script for my youtube videos. I fucked around a little with gpt-3.5 finetuning and langchain, and Youtube/Google APIs and got an idea to make a model and train it from scratch, all by myself.\n\nYoutube video",
"### Source Data\n\n\nYoutube Videos: \n\n-podcasts like Lex Fridman's, Waveform, Joe Rogan, vergecast, bill gates, etc.\n-videos from candaian lad, aevy tv, SNL, lemmino, mrwhosetheboss, johnny harris, and many more.\n-news videos from vox, wallstreetjournal, newyorktimes, the guardian, etc.\n-interviews from variety, wired, y-combinator, eo, etc.\n-lectures from mit opencourseware, cs50, freecodecamp, crashcourse, etc.\n-tech and science from kurzgesagt, real engineering, arvin ash, vsause, veritasium, etc.\n\nURL:\n-articles on various topics like Covid, Nuclear reactions, Antarctica, Nobel prize, Great leaders, countries, etc.",
"#### Data Collection and Processing\n\n\nUsed Youtube V3 API to fetch video ids from a particular Youtube channel and generated a traget url. Then used Youtube Transcript API to fetch transcripts from the videos and write it in a .txt file.\nMade a json file containing channel ids of around 45channels and fetched transcipts from around 167K videos\n\nWebscrapping data was generated using webscrapper that scrapped data from URL and some sites that were fetched by GoogleCustomSearch API."
] |
f1e4b2d7e2e67e5689eea8bf9f440accaae29492 | # nli pairs
j = datasets.load_dataset("andersonbcdefg/jina_negation_v2", split="train").select_columns(["query", "pos"])
syn = datasets.load_dataset("andersonbcdefg/synthetic_nli_combined_mnli_filtered", split="train").select_columns(["query", "pos"])
a = datasets.load_dataset("andersonbcdefg/anli_triples", split="train").select_columns(["query", "pos"])
sim = datasets.load_dataset("andersonbcdefg/simcse_nli", split="train").select_columns(["query", "pos"])
doc = datasets.load_dataset("andersonbcdefg/doc_nli_pos_pairs", split="train")
fever = datasets.load_dataset("pietrolesci/nli_fever", split="train").filter(lambda x: x["label"] == 0).map(
lambda x: {"query": x["premise"], "pos": x["hypothesis"]}
).select_columns(["query", "pos"])
ling = datasets.load_dataset("metaeval/lingnli", split="train").filter(lambda x: x["label"] == "entailment").map(
lambda x: {"query": x["premise"], "pos": x["hypothesis"]}
).select_columns(["query", "pos"]) | andersonbcdefg/nli_pairs_v2 | [
"region:us"
] | 2024-02-03T06:21:57+00:00 | {"dataset_info": {"features": [{"name": "query", "dtype": "string"}, {"name": "pos", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 729187184.8374316, "num_examples": 903218}], "download_size": 204019551, "dataset_size": 729187184.8374316}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-03T19:28:51+00:00 | [] | [] | TAGS
#region-us
| # nli pairs
j = datasets.load_dataset("andersonbcdefg/jina_negation_v2", split="train").select_columns(["query", "pos"])
syn = datasets.load_dataset("andersonbcdefg/synthetic_nli_combined_mnli_filtered", split="train").select_columns(["query", "pos"])
a = datasets.load_dataset("andersonbcdefg/anli_triples", split="train").select_columns(["query", "pos"])
sim = datasets.load_dataset("andersonbcdefg/simcse_nli", split="train").select_columns(["query", "pos"])
doc = datasets.load_dataset("andersonbcdefg/doc_nli_pos_pairs", split="train")
fever = datasets.load_dataset("pietrolesci/nli_fever", split="train").filter(lambda x: x["label"] == 0).map(
lambda x: {"query": x["premise"], "pos": x["hypothesis"]}
).select_columns(["query", "pos"])
ling = datasets.load_dataset("metaeval/lingnli", split="train").filter(lambda x: x["label"] == "entailment").map(
lambda x: {"query": x["premise"], "pos": x["hypothesis"]}
).select_columns(["query", "pos"]) | [
"# nli pairs\nj = datasets.load_dataset(\"andersonbcdefg/jina_negation_v2\", split=\"train\").select_columns([\"query\", \"pos\"])\nsyn = datasets.load_dataset(\"andersonbcdefg/synthetic_nli_combined_mnli_filtered\", split=\"train\").select_columns([\"query\", \"pos\"])\na = datasets.load_dataset(\"andersonbcdefg/anli_triples\", split=\"train\").select_columns([\"query\", \"pos\"])\nsim = datasets.load_dataset(\"andersonbcdefg/simcse_nli\", split=\"train\").select_columns([\"query\", \"pos\"])\ndoc = datasets.load_dataset(\"andersonbcdefg/doc_nli_pos_pairs\", split=\"train\")\nfever = datasets.load_dataset(\"pietrolesci/nli_fever\", split=\"train\").filter(lambda x: x[\"label\"] == 0).map(\n lambda x: {\"query\": x[\"premise\"], \"pos\": x[\"hypothesis\"]}\n).select_columns([\"query\", \"pos\"])\nling = datasets.load_dataset(\"metaeval/lingnli\", split=\"train\").filter(lambda x: x[\"label\"] == \"entailment\").map(\n lambda x: {\"query\": x[\"premise\"], \"pos\": x[\"hypothesis\"]}\n).select_columns([\"query\", \"pos\"])"
] | [
"TAGS\n#region-us \n",
"# nli pairs\nj = datasets.load_dataset(\"andersonbcdefg/jina_negation_v2\", split=\"train\").select_columns([\"query\", \"pos\"])\nsyn = datasets.load_dataset(\"andersonbcdefg/synthetic_nli_combined_mnli_filtered\", split=\"train\").select_columns([\"query\", \"pos\"])\na = datasets.load_dataset(\"andersonbcdefg/anli_triples\", split=\"train\").select_columns([\"query\", \"pos\"])\nsim = datasets.load_dataset(\"andersonbcdefg/simcse_nli\", split=\"train\").select_columns([\"query\", \"pos\"])\ndoc = datasets.load_dataset(\"andersonbcdefg/doc_nli_pos_pairs\", split=\"train\")\nfever = datasets.load_dataset(\"pietrolesci/nli_fever\", split=\"train\").filter(lambda x: x[\"label\"] == 0).map(\n lambda x: {\"query\": x[\"premise\"], \"pos\": x[\"hypothesis\"]}\n).select_columns([\"query\", \"pos\"])\nling = datasets.load_dataset(\"metaeval/lingnli\", split=\"train\").filter(lambda x: x[\"label\"] == \"entailment\").map(\n lambda x: {\"query\": x[\"premise\"], \"pos\": x[\"hypothesis\"]}\n).select_columns([\"query\", \"pos\"])"
] |
8da3ebc53941fb46652c313e0b2391ad5ad41af9 | # Dataset: Korpus-Merger
A large Korean-English corpus dataset for training Korean LLMs.
# Dataset Info
## Basic information
**# of rows:** About 65.1M.
**Expected tokens:** About 20B.
**Expected download size:** About 25GB.
**Data processing period:** About 2 weeks.
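For scale, that averages to roughly 20B / 65.1M ≈ 307 tokens per row.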
## Which datasets?
We used 50 datasets through **[AIhub](https://www.aihub.or.kr/)**. We also used a **행정안전부 (Ministry of the Interior and Safety, private)** dataset.
[Details info](https://huggingface.co/datasets/MarkrAI/Korpus-Merger/blob/main/Corpus_lists).
## Info about each dataset
```
name: expertise_1
num_examples: 1379830
name: expertise_2
num_examples: 1033716
name: common_knowledge
num_examples: 6216545
name: MRC
num_examples: 486812
name: SNS
num_examples: 6548177
name: KoEn
num_examples: 32131380
name: conversation
num_examples: 1175803
name: book
num_examples: 3760358
name: etc
num_examples: 9077808
name: etc_plus
num_examples: 3338651
```
## Preprocessing
```
1. Preprocessed so that, wherever possible, each sentence ends with a period.
   - Applied to documents, papers, patents, etc.
2. Colloquial-style datasets such as conversations and SNS were kept as-is.
3. For multilingual datasets, all languages other than English were removed.
   - Implemented with regular expressions.
4. Korean and English data were split into separate rows.
5. Empty values ('') were handled.
```
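A minimal sketch of what steps 3 and 5 might look like (the column name `text` and the ASCII-based regex are assumptions, not the pipeline actually used):

```python
import re
from datasets import Dataset

# Toy rows standing in for the real corpus, which is not shareable.
ds = Dataset.from_dict({"text": ["한국어 문장 hello world", "english only", ""]})

non_english = re.compile(r"[^\x00-\x7F]+")  # assumption: treat non-ASCII as non-English

ds = ds.map(lambda x: {"text": non_english.sub("", x["text"]).strip()})  # step 3
ds = ds.filter(lambda x: x["text"] != "")                                # step 5
print(ds["text"])  # ['hello world', 'english only']
```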
If you want to see the multilingual datasets, click [this](https://huggingface.co/datasets/CorpuSlave/multilingual).
In this case, first remove the `KoEn` dataset, and add the `multilingual` dataset to `MarkrAI/Ko-Corpus-Merger`.
## License
The license is **cc-by-nc-sa-4.0**.
Also, this includes the `행정안전부 (private)` and `AIhub` datasets, so we cannot share this corpus.
We recommend downloading the datasets manually.
## Merger list
[CorpuSlave/expertise_1](https://huggingface.co/datasets/CorpuSlave/expertise_1).
[CorpuSlave/expertise_2](https://huggingface.co/datasets/CorpuSlave/expertise_2).
[CorpuSlave/common-knowledge](https://huggingface.co/datasets/CorpuSlave/common-knowledge).
[CorpuSlave/MRC](https://huggingface.co/datasets/CorpuSlave/MRC).
[CorpuSlave/SNS](https://huggingface.co/datasets/CorpuSlave/SNS).
[CorpuSlave/KoEn](https://huggingface.co/datasets/CorpuSlave/KoEn).
[CorpuSlave/conversation](https://huggingface.co/datasets/CorpuSlave/conversation).
[CorpuSlave/book](https://huggingface.co/datasets/CorpuSlave/book).
[CorpuSlave/etc](https://huggingface.co/datasets/CorpuSlave/etc).
[CorpuSlave/etc_plus](https://huggingface.co/datasets/CorpuSlave/etc_plus).
- [Datasets collection](https://huggingface.co/collections/CorpuSlave/korean-corpus-lists-65bde1fedaa6f482566793de).
## Reference
[beomi's corpus](https://huggingface.co/beomi/OPEN-SOLAR-KO-10.7B/tree/main/corpus).
| MarkrAI/Korpus-Merger | [
"size_categories:10M<n<100M",
"language:ko",
"language:en",
"license:cc-by-nc-sa-4.0",
"region:us"
] | 2024-02-03T06:51:12+00:00 | {"language": ["ko", "en"], "license": "cc-by-nc-sa-4.0", "size_categories": ["10M<n<100M"], "configs": [{"config_name": "expertise_1", "data_files": [{"split": "train", "path": "expertise_1/train-*"}]}, {"config_name": "expertise_2", "data_files": [{"split": "train", "path": "expertise_2/train-*"}]}, {"config_name": "common_knowledge", "data_files": [{"split": "train", "path": "common_knowledge/train-*"}]}, {"config_name": "MRC", "data_files": [{"split": "train", "path": "MRC/train-*"}]}, {"config_name": "SNS", "data_files": [{"split": "train", "path": "SNS/train-*"}]}, {"config_name": "KoEn", "data_files": [{"split": "train", "path": "KoEn/train-*"}]}, {"config_name": "conversation", "data_files": [{"split": "train", "path": "conversation/train-*"}]}, {"config_name": "book", "data_files": [{"split": "train", "path": "book/train-*"}]}, {"config_name": "etc", "data_files": [{"split": "train", "path": "etc/train-*"}]}, {"config_name": "etc_plus", "data_files": [{"split": "train", "path": "etc_plus/train-*"}]}], "dataset_info": [{"config_name": "expertise_1", "features": [{"name": "text", "dtype": "string"}, {"name": "doc_id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 8027905291, "num_examples": 1379830}], "download_size": 3070150673, "dataset_size": 8027905291}, {"config_name": "expertise_2", "features": [{"name": "text", "dtype": "string"}, {"name": "doc_id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 19649919857, "num_examples": 1033716}], "download_size": 7150263602, "dataset_size": 19649919857}, {"config_name": "common_knowledge", "features": [{"name": "text", "dtype": "string"}, {"name": "doc_id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 10650689076, "num_examples": 6216545}], "download_size": 5826173649, "dataset_size": 10650689076}, {"config_name": "MRC", "features": [{"name": "text", "dtype": "string"}, {"name": "doc_id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 804002756, "num_examples": 486812}], "download_size": 424720173, "dataset_size": 804002756}, {"config_name": "SNS", "features": [{"name": "text", "dtype": "string"}, {"name": "doc_id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1362629645, "num_examples": 6548177}], "download_size": 682242647, "dataset_size": 1362629645}, {"config_name": "KoEn", "features": [{"name": "text", "dtype": "string"}, {"name": "doc_id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 6028010433, "num_examples": 32131380}], "download_size": 2633226968, "dataset_size": 6028010433}, {"config_name": "conversation", "features": [{"name": "text", "dtype": "string"}, {"name": "doc_id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 727147995, "num_examples": 1175803}], "download_size": 334681572, "dataset_size": 727147995}, {"config_name": "book", "features": [{"name": "text", "dtype": "string"}, {"name": "doc_id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2960895627, "num_examples": 3760358}], "download_size": 1607840975, "dataset_size": 2960895627}, {"config_name": "etc", "features": [{"name": "text", "dtype": "string"}, {"name": "doc_id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1994571164, "num_examples": 9077808}], "download_size": 989107824, "dataset_size": 1994571164}, {"config_name": "etc_plus", "features": [{"name": "text", "dtype": "string"}, {"name": "doc_id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3606378832, 
"num_examples": 3338651}], "download_size": 2302324264, "dataset_size": 3606378832}]} | 2024-02-05T07:43:18+00:00 | [] | [
"ko",
"en"
] | TAGS
#size_categories-10M<n<100M #language-Korean #language-English #license-cc-by-nc-sa-4.0 #region-us
| # Dataset: Korpus-Merger
A large Korean-English corpus dataset for training Korean LLMs.
# Dataset Info
## Basic information
# of rows: About 65.1M.
Expected tokens: About 20B.
Expected download size: About 25GB.
Data processing period: About 2 weeks.
## Which datasets?
We used 50 datasets through AIhub. And, we also used 행정안전부(private) dataset.
Details info.
## Info about each dataset
## Preprocessing
If you want to see about multilingual datasets, click this.
In this case, firstly remove 'KoEn' dataset, and add 'multilingual' dataset into 'MarkrAI/Ko-Corpus-Merger'.
## License
The license is cc-by-nc-sa-4.0.
Also, this included '행정안전부(private) and AIhub' dataset, so we cannot share this corpus.
We recommend that manually download the datasets.
## Merger list
CorpuSlave/expertise_1.
CorpuSlave/expertise_2.
CorpuSlave/common-knowledge.
CorpuSlave/MRC.
CorpuSlave/SNS.
CorpuSlave/KoEn.
CorpuSlave/conversation.
CorpuSlave/book.
CorpuSlave/etc.
CorpuSlave/etc_plus.
- Datasets collection.
## Reference
beomi's corpus.
| [
"# Dataset: Korpus-Merger\n한국어 LLM 학습을 위한 거대 한국어-영어 corpus 데이터셋.",
"# Dataset Info",
"## Basic information",
"# of rows: About 65.1M. \nExpected tokens: About 20B. \nExpected download size: About 25GB. \nData processing period: About 2 weeks.",
"## Which datasets?\nWe used 50 datasets through AIhub. And, we also used 행정안전부(private) dataset. \nDetails info.",
"## Info about each dataset",
"## Preprocessing\n \nIf you want to see about multilingual datasets, click this. \nIn this case, firstly remove 'KoEn' dataset, and add 'multilingual' dataset into 'MarkrAI/Ko-Corpus-Merger'.",
"## License\nThe license is cc-by-nc-sa-4.0. \nAlso, this included '행정안전부(private) and AIhub' dataset, so we cannot share this corpus. \nWe recommend that manually download the datasets.",
"## Merger list\nCorpuSlave/expertise_1. \nCorpuSlave/expertise_2. \nCorpuSlave/common-knowledge. \nCorpuSlave/MRC. \nCorpuSlave/SNS. \nCorpuSlave/KoEn. \nCorpuSlave/conversation. \nCorpuSlave/book. \nCorpuSlave/etc. \nCorpuSlave/etc_plus. \n\n- Datasets collection.",
"## Reference\nbeomi's corpus."
] | [
"TAGS\n#size_categories-10M<n<100M #language-Korean #language-English #license-cc-by-nc-sa-4.0 #region-us \n",
"# Dataset: Korpus-Merger\n한국어 LLM 학습을 위한 거대 한국어-영어 corpus 데이터셋.",
"# Dataset Info",
"## Basic information",
"# of rows: About 65.1M. \nExpected tokens: About 20B. \nExpected download size: About 25GB. \nData processing period: About 2 weeks.",
"## Which datasets?\nWe used 50 datasets through AIhub. And, we also used 행정안전부(private) dataset. \nDetails info.",
"## Info about each dataset",
"## Preprocessing\n \nIf you want to see about multilingual datasets, click this. \nIn this case, firstly remove 'KoEn' dataset, and add 'multilingual' dataset into 'MarkrAI/Ko-Corpus-Merger'.",
"## License\nThe license is cc-by-nc-sa-4.0. \nAlso, this included '행정안전부(private) and AIhub' dataset, so we cannot share this corpus. \nWe recommend that manually download the datasets.",
"## Merger list\nCorpuSlave/expertise_1. \nCorpuSlave/expertise_2. \nCorpuSlave/common-knowledge. \nCorpuSlave/MRC. \nCorpuSlave/SNS. \nCorpuSlave/KoEn. \nCorpuSlave/conversation. \nCorpuSlave/book. \nCorpuSlave/etc. \nCorpuSlave/etc_plus. \n\n- Datasets collection.",
"## Reference\nbeomi's corpus."
] |
096841382a4065abf5a27cc012201726ae9f90ef | # Dataset Card for "HumanEval_mbpp_format"
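A minimal usage sketch, assuming the features listed in the metadata (`task_id`, `prompt`) and the single `train` split:

```python
from datasets import load_dataset

# 164 HumanEval problems reformatted to the MBPP prompt style (per the metadata).
ds = load_dataset("rookielixinye/HumanEval_mbpp_format", split="train")
print(ds[0]["task_id"], ds[0]["prompt"][:80])
```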
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | rookielixinye/HumanEval_mbpp_format | [
"region:us"
] | 2024-02-03T07:17:50+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "task_id", "dtype": "string"}, {"name": "prompt", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 58366, "num_examples": 164}], "download_size": 24961, "dataset_size": 58366}} | 2024-02-03T07:17:55+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "HumanEval_mbpp_format"
More Information needed | [
"# Dataset Card for \"HumanEval_mbpp_format\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"HumanEval_mbpp_format\"\n\nMore Information needed"
] |
5ba02a1cace538a93d902c3ae3c2d28b8bbf6ec9 |
# Dataset Card for Evaluation run of Vasanth/Beast-Soul-new
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Vasanth/Beast-Soul-new](https://huggingface.co/Vasanth/Beast-Soul-new) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Vasanth__Beast-Soul-new",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-03T07:23:20.760411](https://huggingface.co/datasets/open-llm-leaderboard/details_Vasanth__Beast-Soul-new/blob/main/results_2024-02-03T07-23-20.760411.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6536990566207631,
"acc_stderr": 0.03208057193564134,
"acc_norm": 0.6528661054036955,
"acc_norm_stderr": 0.03275608995369063,
"mc1": 0.543451652386781,
"mc1_stderr": 0.017437280953183688,
"mc2": 0.6738214693586763,
"mc2_stderr": 0.015349612490988648
},
"harness|arc:challenge|25": {
"acc": 0.7030716723549488,
"acc_stderr": 0.013352025976725225,
"acc_norm": 0.7312286689419796,
"acc_norm_stderr": 0.012955065963710696
},
"harness|hellaswag|10": {
"acc": 0.7182832105158335,
"acc_stderr": 0.004489166767430656,
"acc_norm": 0.8834893447520414,
"acc_norm_stderr": 0.003201805872737069
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700914,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700914
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7916666666666666,
"acc_stderr": 0.033961162058453336,
"acc_norm": 0.7916666666666666,
"acc_norm_stderr": 0.033961162058453336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083522,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083522
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033484,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033484
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.02882088466625326,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.02882088466625326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461783,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461783
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601443,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601443
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.036412970813137296,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.036412970813137296
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.01358661921990334,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.01358661921990334
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545543,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545543
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4301675977653631,
"acc_stderr": 0.01655860163604103,
"acc_norm": 0.4301675977653631,
"acc_norm_stderr": 0.01655860163604103
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.02389187954195961,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.02389187954195961
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4621903520208605,
"acc_stderr": 0.012733671880342511,
"acc_norm": 0.4621903520208605,
"acc_norm_stderr": 0.012733671880342511
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.028582709753898445,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.028582709753898445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.018975427920507205,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.018975427920507205
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.543451652386781,
"mc1_stderr": 0.017437280953183688,
"mc2": 0.6738214693586763,
"mc2_stderr": 0.015349612490988648
},
"harness|winogrande|5": {
"acc": 0.8524072612470402,
"acc_stderr": 0.00996871576547965
},
"harness|gsm8k|5": {
"acc": 0.6974981046247157,
"acc_stderr": 0.012652544133186143
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Vasanth__Beast-Soul-new | [
"region:us"
] | 2024-02-03T07:25:42+00:00 | {"pretty_name": "Evaluation run of Vasanth/Beast-Soul-new", "dataset_summary": "Dataset automatically created during the evaluation run of model [Vasanth/Beast-Soul-new](https://huggingface.co/Vasanth/Beast-Soul-new) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Vasanth__Beast-Soul-new\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-03T07:23:20.760411](https://huggingface.co/datasets/open-llm-leaderboard/details_Vasanth__Beast-Soul-new/blob/main/results_2024-02-03T07-23-20.760411.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6536990566207631,\n \"acc_stderr\": 0.03208057193564134,\n \"acc_norm\": 0.6528661054036955,\n \"acc_norm_stderr\": 0.03275608995369063,\n \"mc1\": 0.543451652386781,\n \"mc1_stderr\": 0.017437280953183688,\n \"mc2\": 0.6738214693586763,\n \"mc2_stderr\": 0.015349612490988648\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7030716723549488,\n \"acc_stderr\": 0.013352025976725225,\n \"acc_norm\": 0.7312286689419796,\n \"acc_norm_stderr\": 0.012955065963710696\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7182832105158335,\n \"acc_stderr\": 0.004489166767430656,\n \"acc_norm\": 0.8834893447520414,\n \"acc_norm_stderr\": 0.003201805872737069\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700914,\n \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700914\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7916666666666666,\n \"acc_stderr\": 0.033961162058453336,\n \"acc_norm\": 0.7916666666666666,\n \"acc_norm_stderr\": 0.033961162058453336\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n 
\"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083522,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083522\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033484,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033484\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461783,\n \"acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461783\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601443,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601443\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.036412970813137296,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.036412970813137296\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n \"acc_stderr\": 
0.01358661921990334,\n \"acc_norm\": 0.8250319284802043,\n \"acc_norm_stderr\": 0.01358661921990334\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4301675977653631,\n \"acc_stderr\": 0.01655860163604103,\n \"acc_norm\": 0.4301675977653631,\n \"acc_norm_stderr\": 0.01655860163604103\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.02389187954195961,\n \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.02389187954195961\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4621903520208605,\n \"acc_stderr\": 0.012733671880342511,\n \"acc_norm\": 0.4621903520208605,\n \"acc_norm_stderr\": 0.012733671880342511\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.028582709753898445,\n \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.028582709753898445\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507205,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507205\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.543451652386781,\n \"mc1_stderr\": 0.017437280953183688,\n \"mc2\": 0.6738214693586763,\n \"mc2_stderr\": 0.015349612490988648\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8524072612470402,\n \"acc_stderr\": 0.00996871576547965\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6974981046247157,\n \"acc_stderr\": 0.012652544133186143\n }\n}\n```", "repo_url": 
"https://huggingface.co/Vasanth/Beast-Soul-new", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|arc:challenge|25_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|gsm8k|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hellaswag|10_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T07-23-20.760411.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T07-23-20.760411.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T07-23-20.760411.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T07-23-20.760411.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T07-23-20.760411.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_03T07_23_20.760411", "path": ["**/details_harness|winogrande|5_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-03T07-23-20.760411.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_03T07_23_20.760411", "path": ["results_2024-02-03T07-23-20.760411.parquet"]}, {"split": "latest", "path": ["results_2024-02-03T07-23-20.760411.parquet"]}]}]} | 2024-02-03T07:26:08+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Vasanth/Beast-Soul-new
Dataset automatically created during the evaluation run of model Vasanth/Beast-Soul-new on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
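```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Vasanth__Beast-Soul-new",
	"harness_winogrande_5",
	split="train")
```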
## Latest results
These are the latest results from run 2024-02-03T07:23:20.760411 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Vasanth/Beast-Soul-new\n\n\n\nDataset automatically created during the evaluation run of model Vasanth/Beast-Soul-new on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T07:23:20.760411(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Vasanth/Beast-Soul-new\n\n\n\nDataset automatically created during the evaluation run of model Vasanth/Beast-Soul-new on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T07:23:20.760411(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
765360cb298ba5e8aa09061829e2a7846eecf494 |
# Dataset Card for CantoMap
## Dataset Description
- **Homepage:** https://github.com/gwinterstein/CantoMap/
- **Repository:** https://github.com/gwinterstein/CantoMap/
- **Paper:** http://www.lrec-conf.org/proceedings/lrec2020/pdf/2020.lrec-1.355.pdf
### Dataset Summary
The Common Voice dataset consists of unique MP3 files and their corresponding text files.
Many of the 30328 recorded hours in the dataset also include demographic metadata like age, sex, and accent
that can help improve the accuracy of speech recognition engines.
The dataset currently consists of 19673 validated hours in 120 languages, but more voices and languages are always being added.
Take a look at the [Languages](https://commonvoice.mozilla.org/en/languages) page to request a language or start contributing.
### Languages
```
Cantonese
```
## How to use
The `datasets` library allows you to load and pre-process your dataset in pure Python, at scale. The dataset can be downloaded and prepared in one call to your local drive by using the `load_dataset` function.
For example, to download the Cantonese config, simply specify the corresponding language config name (i.e., "yue" for Cantonese):
```python
from datasets import load_dataset
cv_16 = load_dataset("safecantonese/cantomap", "yue", split="train")
```
Using the datasets library, you can also stream the dataset on-the-fly by adding a `streaming=True` argument to the `load_dataset` function call. Loading a dataset in streaming mode loads individual samples of the dataset at a time, rather than downloading the entire dataset to disk.
```python
from datasets import load_dataset
cv_16 = load_dataset("safecantonese/cantomap", "yue", split="train", streaming=True)
print(next(iter(cv_16)))
```
*Bonus*: create a [PyTorch dataloader](https://huggingface.co/docs/datasets/use_with_pytorch) directly with your own datasets (local/streamed).
### Local
```python
from datasets import load_dataset
from torch.utils.data import DataLoader
from torch.utils.data.sampler import BatchSampler, RandomSampler
cv_16 = load_dataset("safecantonese/cantomap", "yue", split="train")
batch_sampler = BatchSampler(RandomSampler(cv_16), batch_size=32, drop_last=False)
dataloader = DataLoader(cv_16, batch_sampler=batch_sampler)
```
### Streaming
```python
from datasets import load_dataset
from torch.utils.data import DataLoader
cv_16 = load_dataset("safecantonese/cantomap", "yue", split="train", streaming=True)
dataloader = DataLoader(cv_16, batch_size=32)
```
To find out more about loading and preparing audio datasets, head over to [hf.co/blog/audio-datasets](https://huggingface.co/blog/audio-datasets).
### Example scripts
Train your own CTC or Seq2Seq Automatic Speech Recognition models on CantoMap with `transformers` - [here](https://github.com/huggingface/transformers/tree/main/examples/pytorch/speech-recognition).
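As a rough sketch of the preprocessing such scripts perform (not the official example; the checkpoint name and 16 kHz target below are assumptions), extracting CTC input features could look like this:
```python
from datasets import load_dataset, Audio
from transformers import Wav2Vec2FeatureExtractor
ds = load_dataset("safecantonese/cantomap", "yue", split="train")
# CTC-style encoders are typically trained on 16 kHz audio, so resample first.
ds = ds.cast_column("audio", Audio(sampling_rate=16_000))
feature_extractor = Wav2Vec2FeatureExtractor.from_pretrained("facebook/wav2vec2-xls-r-300m")
def prepare(batch):
    audio = batch["audio"]
    # Turn the raw waveform into model-ready input values.
    batch["input_values"] = feature_extractor(
        audio["array"], sampling_rate=audio["sampling_rate"]
    ).input_values[0]
    return batch
ds = ds.map(prepare, remove_columns=["audio"])
```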
## Dataset Structure
### Data Instances
A typical data point comprises the `path` to the audio file and its `sentence`.
```python
{
'path': 'et/clips/common_voice_et_18318995.mp3',
'audio': {
'path': 'et/clips/common_voice_et_18318995.mp3',
'array': array([-0.00048828, -0.00018311, -0.00137329, ..., 0.00079346, 0.00091553, 0.00085449], dtype=float32),
'sampling_rate': 48000
},
'sentence': 'Tasub kokku saada inimestega, keda tunned juba ammust ajast saati.',
}
```
### Data Fields
`path` (`string`): The path to the audio file
`audio` (`dict`): A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate. Note that when accessing the audio column: `dataset[0]["audio"]` the audio file is automatically decoded and resampled to `dataset.features["audio"].sampling_rate`. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the `"audio"` column, *i.e.* `dataset[0]["audio"]` should **always** be preferred over `dataset["audio"][0]`.
`sentence` (`string`): The sentence the user was prompted to speak
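For example, a minimal sketch of on-the-fly resampling (the 16 kHz target is an arbitrary choice) that follows the index-first access pattern:
```python
from datasets import load_dataset, Audio
ds = load_dataset("safecantonese/cantomap", "yue", split="train")
# Decode and resample lazily; only the queried sample is processed.
ds = ds.cast_column("audio", Audio(sampling_rate=16_000))
sample = ds[0]["audio"]         # index first, then the "audio" column
print(sample["sampling_rate"])  # 16000
print(sample["array"][:10])     # first few resampled amplitude values
```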
### Data Splits
The speech material has been subdivided into portions for train and test.
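A minimal sketch, assuming the portions are exposed under the usual "train" and "test" split names:
```python
from datasets import load_dataset
splits = load_dataset("safecantonese/cantomap", "yue")  # DatasetDict with both splits
print(splits["train"].num_rows, splits["test"].num_rows)
```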
## Additional Information
### Licensing Information
gpl-3.0
### Citation Information
```
@inproceedings{lrec:2020,
author = {Winterstein, Grégoire and Tang, Carmen and Lai, Regine},
title = {CantoMap: a Hong Kong Cantonese MapTask Corpus}
}
```
| safecantonese/cantomap | [
"annotations_creators:crowdsourced",
"language_creators:crowdsourced",
"multilinguality:monolingual",
"language:yue",
"license:gpl-3.0",
"region:us"
] | 2024-02-03T08:14:51+00:00 | {"annotations_creators": ["crowdsourced"], "language_creators": ["crowdsourced"], "language": ["yue"], "license": ["gpl-3.0"], "multilinguality": ["monolingual"], "pretty_name": "CantoMap"} | 2024-02-03T15:19:34+00:00 | [] | [
"yue"
] | TAGS
#annotations_creators-crowdsourced #language_creators-crowdsourced #multilinguality-monolingual #language-Yue Chinese #license-gpl-3.0 #region-us
|
# Dataset Card for CantoMap
## Dataset Description
- Homepage: URL
- Repository: URL
- Paper: URL
### Dataset Summary
The Common Voice dataset consists of a unique MP3 and corresponding text file.
Many of the 30328 recorded hours in the dataset also include demographic metadata like age, sex, and accent
that can help improve the accuracy of speech recognition engines.
The dataset currently consists of 19673 validated hours in 120 languages, but more voices and languages are always added.
Take a look at the Languages page to request a language or start contributing.
### Languages
## How to use
The 'datasets' library allows you to load and pre-process your dataset in pure Python, at scale. The dataset can be downloaded and prepared in one call to your local drive by using the 'load_dataset' function.
For example, to download the Cantonese config, simply specify the corresponding language config name (i.e., "yue" for Cantonese):
Using the datasets library, you can also stream the dataset on-the-fly by adding a 'streaming=True' argument to the 'load_dataset' function call. Loading a dataset in streaming mode loads individual samples of the dataset at a time, rather than downloading the entire dataset to disk.
*Bonus*: create a PyTorch dataloader directly with your own datasets (local/streamed).
### Local
### Streaming
To find out more about loading and preparing audio datasets, head over to URL
### Example scripts
Train your own CTC or Seq2Seq Automatic Speech Recognition models on CantoMap with 'transformers' - here.
## Dataset Structure
### Data Instances
A typical data point comprises the 'path' to the audio file and its 'sentence'.
### Data Fields
'path' ('string'): The path to the audio file
'audio' ('dict'): A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate. Note that when accessing the audio column: 'dataset[0]["audio"]' the audio file is automatically decoded and resampled to 'dataset.features["audio"].sampling_rate'. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the '"audio"' column, *i.e.* 'dataset[0]["audio"]' should always be preferred over 'dataset["audio"][0]'.
'sentence' ('string'): The sentence the user was prompted to speak
### Data Splits
The speech material has been subdivided into portions for train and test.
## Additional Information
### Licensing Information
gpl-3.0
| [
"# Dataset Card for CantoMap",
"## Dataset Description\n\n- Homepage: URL\n- Repository: URL\n- Paper: URL",
"### Dataset Summary\n\nThe Common Voice dataset consists of a unique MP3 and corresponding text file. \nMany of the 30328 recorded hours in the dataset also include demographic metadata like age, sex, and accent \nthat can help improve the accuracy of speech recognition engines.\n\nThe dataset currently consists of 19673 validated hours in 120 languages, but more voices and languages are always added. \nTake a look at the Languages page to request a language or start contributing.",
"### Languages",
"## How to use\n\nThe 'datasets' library allows you to load and pre-process your dataset in pure Python, at scale. The dataset can be downloaded and prepared in one call to your local drive by using the 'load_dataset' function. \n\nFor example, to download the Cantonese config, simply specify the corresponding language config name (i.e., \"yue\" for Cantonese):\n\n\nUsing the datasets library, you can also stream the dataset on-the-fly by adding a 'streaming=True' argument to the 'load_dataset' function call. Loading a dataset in streaming mode loads individual samples of the dataset at a time, rather than downloading the entire dataset to disk.\n\n\n*Bonus*: create a PyTorch dataloader directly with your own datasets (local/streamed).",
"### Local",
"### Streaming\n\n\n\nTo find out more about loading and preparing audio datasets, head over to URL",
"### Example scripts\n\nTrain your own CTC or Seq2Seq Automatic Speech Recognition models on CantoMap with 'transformers' - here.",
"## Dataset Structure",
"### Data Instances\n\nA typical data point comprises the 'path' to the audio file and its 'sentence'.",
"### Data Fields\n\n'path' ('string'): The path to the audio file\n\n'audio' ('dict'): A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate. Note that when accessing the audio column: 'dataset[0][\"audio\"]' the audio file is automatically decoded and resampled to 'dataset.features[\"audio\"].sampling_rate'. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the '\"audio\"' column, *i.e.* 'dataset[0][\"audio\"]' should always be preferred over 'dataset[\"audio\"][0]'.\n\n'sentence' ('string'): The sentence the user was prompted to speak",
"### Data Splits\n\nThe speech material has been subdivided into portions for train and test.",
"## Additional Information",
"### Licensing Information\n\ngpl-3.0"
] | [
"TAGS\n#annotations_creators-crowdsourced #language_creators-crowdsourced #multilinguality-monolingual #language-Yue Chinese #license-gpl-3.0 #region-us \n",
"# Dataset Card for CantoMap",
"## Dataset Description\n\n- Homepage: URL\n- Repository: URL\n- Paper: URL",
"### Dataset Summary\n\nThe Common Voice dataset consists of a unique MP3 and corresponding text file. \nMany of the 30328 recorded hours in the dataset also include demographic metadata like age, sex, and accent \nthat can help improve the accuracy of speech recognition engines.\n\nThe dataset currently consists of 19673 validated hours in 120 languages, but more voices and languages are always added. \nTake a look at the Languages page to request a language or start contributing.",
"### Languages",
"## How to use\n\nThe 'datasets' library allows you to load and pre-process your dataset in pure Python, at scale. The dataset can be downloaded and prepared in one call to your local drive by using the 'load_dataset' function. \n\nFor example, to download the Cantonese config, simply specify the corresponding language config name (i.e., \"yue\" for Cantonese):\n\n\nUsing the datasets library, you can also stream the dataset on-the-fly by adding a 'streaming=True' argument to the 'load_dataset' function call. Loading a dataset in streaming mode loads individual samples of the dataset at a time, rather than downloading the entire dataset to disk.\n\n\n*Bonus*: create a PyTorch dataloader directly with your own datasets (local/streamed).",
"### Local",
"### Streaming\n\n\n\nTo find out more about loading and preparing audio datasets, head over to URL",
"### Example scripts\n\nTrain your own CTC or Seq2Seq Automatic Speech Recognition models on CantoMap with 'transformers' - here.",
"## Dataset Structure",
"### Data Instances\n\nA typical data point comprises the 'path' to the audio file and its 'sentence'.",
"### Data Fields\n\n'path' ('string'): The path to the audio file\n\n'audio' ('dict'): A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate. Note that when accessing the audio column: 'dataset[0][\"audio\"]' the audio file is automatically decoded and resampled to 'dataset.features[\"audio\"].sampling_rate'. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the '\"audio\"' column, *i.e.* 'dataset[0][\"audio\"]' should always be preferred over 'dataset[\"audio\"][0]'.\n\n'sentence' ('string'): The sentence the user was prompted to speak",
"### Data Splits\n\nThe speech material has been subdivided into portions for train and test.",
"## Additional Information",
"### Licensing Information\n\ngpl-3.0"
] |
c4343ca2fc853d36c199606ffba307cd3fb33a9a | # Dataset Card for "dpo_data_capy"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | pvduy/dpo_data_capy | [
"region:us"
] | 2024-02-03T08:14:54+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "chosen", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "rejected", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 264301004, "num_examples": 45600}, {"name": "test", "num_bytes": 8556760, "num_examples": 1964}], "download_size": 148360235, "dataset_size": 272857764}} | 2024-02-03T08:15:09+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "dpo_data_capy"
More Information needed | [
"# Dataset Card for \"dpo_data_capy\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"dpo_data_capy\"\n\nMore Information needed"
] |
19dfef906d7cda493491c9c8fc6abeee49daad13 | Contains 27k sentence pairs for Nepali Spell Checking.
https://www.kaggle.com/code/amardura/thegroup-nep-spell-synthetic-datapoints | duraad/nep-spell-synthetic-27k | [
"size_categories:10K<n<100K",
"language:ne",
"license:mit",
"nepali",
"spelling",
"region:us"
] | 2024-02-03T08:16:28+00:00 | {"language": ["ne"], "license": "mit", "size_categories": ["10K<n<100K"], "tags": ["nepali", "spelling"]} | 2024-02-03T08:24:31+00:00 | [] | [
"ne"
] | TAGS
#size_categories-10K<n<100K #language-Nepali (macrolanguage) #license-mit #nepali #spelling #region-us
| Contains 27k sentence pairs for Nepali Spell Checking.
URL | [] | [
"TAGS\n#size_categories-10K<n<100K #language-Nepali (macrolanguage) #license-mit #nepali #spelling #region-us \n"
] |
d78944e36331eaec98a8c795b528d0a2f2effdac | # Dataset Card for "sanskrit-russian-short"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | lingtrain/sanskrit-russian-short | [
"region:us"
] | 2024-02-03T08:33:46+00:00 | {"dataset_info": {"features": [{"name": "ru", "dtype": "string"}, {"name": "san", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 15746614, "num_examples": 36131}], "download_size": 8244708, "dataset_size": 15746614}} | 2024-02-03T08:33:50+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "sanskrit-russian-short"
More Information needed | [
"# Dataset Card for \"sanskrit-russian-short\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"sanskrit-russian-short\"\n\nMore Information needed"
] |
056af020885202195bfeac0c3be1a8716fc95407 | # Dataset Card for "multiturn_chat_0.8m-chinese-zhtw"
## Contents
Contains approximately 0.8 million multi-turn dialogues between *user* and *assistant* produced by the [BELLE](https://github.com/LianjiaTech/BELLE) project.
Note: this dataset was generated by ChatGPT and has not been strictly verified, so its contents may contain errors. Please keep this in mind when using it.
## Limitations and Usage Restrictions
We require developers to use our open-sourced code, data, models, and derivative works for research purposes only; they must not be used commercially or for any other purpose that could harm society.
Since the data was generated by *ChatGPT* and has not been strictly verified, it still has shortcomings in factual accuracy and other respects, so please exercise careful judgment when using this dataset.
This dataset does not represent the position, interests, or views of any party, nor any kind of claim by any group. The developers of this project assume no responsibility for any damage or dispute arising from the use of this dataset.
***
# Multiturn Chat 0.8M
## Contents
Includes approx. 0.8M Chinese multiturn dialogs between *human* and *assistant*.
Note: this subset was generated by *ChatGPT* and was not strictly verified. The dialog contents might contain errors. Please keep this in mind when using this subset.
**instruction** contains history dialog context, distinguishable by *Human:* and *Assistant:*, **output** contains the current reply by *assistant*.
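A small sketch of recovering the individual turns from such a flat instruction string (the exact marker format is an assumption based on the description above):
```python
import re
def split_turns(instruction: str):
    # Split the dialog history on the "Human:" / "Assistant:" speaker markers.
    parts = re.split(r"(Human:|Assistant:)", instruction)
    turns, role = [], None
    for part in parts:
        part = part.strip()
        if part in ("Human:", "Assistant:"):
            role = part.rstrip(":").lower()
        elif part and role is not None:
            turns.append({"role": role, "content": part})
    return turns
```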
| benchang1110/multiturn_chat_0.8m-chinese-zhtw | [
"region:us"
] | 2024-02-03T08:39:46+00:00 | {"dataset_info": {"features": [{"name": "conversations", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 969083052, "num_examples": 831036}], "download_size": 561072214, "dataset_size": 969083052}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-03T08:56:44+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "multiturn_chat_0.8m-chinese-zhtw"
## Contents
Contains approximately 0.8 million multi-turn dialogues between *user* and *assistant* produced by the BELLE project.
Note: this dataset was generated by ChatGPT and has not been strictly verified, so its contents may contain errors. Please keep this in mind when using it.
## Limitations and Usage Restrictions
We require developers to use our open-sourced code, data, models, and derivative works for research purposes only; they must not be used commercially or for any other purpose that could harm society.
Since the data was generated by *ChatGPT* and has not been strictly verified, it still has shortcomings in factual accuracy and other respects, so please exercise careful judgment when using this dataset.
This dataset does not represent the position, interests, or views of any party, nor any kind of claim by any group. The developers of this project assume no responsibility for any damage or dispute arising from the use of this dataset.
*
# Multiturn Chat 0.8M
## Contents
Includes approx. 0.8M Chinese multiturn dialogs between *human* and *assistant*.
Note: this subset was generated by *ChatGPT* and was not strictly verified. The dialog contents might contain errors. Please keep this in mind when using this subset.
instruction contains history dialog context, distinguishable by *Human:* and *Assistant:*, output contains the current reply by *assistant*.
| [
"# Dataset Card for \"multiturn_chat_0.8m-chinese-zhtw\"",
"## 內容\n\n包含約 80 萬條由 BELLE 專案所產生的 *user* 與 *assistant* 的多輪對話。\n\n注意:此資料集是由 ChatGPT 產生的,未經嚴格校驗,內容可能包含錯誤。使用過程中請注意這一點。",
"## 限制和使用限制\n我們要求開發者僅將我們開源的程式碼、資料、模型及後續衍生物用於研究目的,不得用於商業,以及其他會對社會帶來危害的用途。\n\n由於數據是由*ChatGPT*產生的,未經嚴格驗證,在事實性和其他方面仍有一些不足之處。因此,在使用此資料集時,請務必注意甄別。\n\n本資料集不代表任何一方的立場、利益或想法,無關任何團體的任何類型的主張。因使用本資料集帶來的任何損害、糾紛,本專案的開發者不承擔任何責任。\n\n\n\n*",
"# Multiturn Chat 0.8M",
"## Contents\nIncludes approx. 0.8M Chinese multiturn dialogs between *human* and *assistant*.\nNote: this subset was generated by *ChatGPT* and was not strictly verified. The dialog contents might contain errors. Please take this in mind when using this subset.\ninstruction contains history dialog context, distinguishable by *Human:* and *Assistant:*, output contains the current reply by *assistant*."
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"multiturn_chat_0.8m-chinese-zhtw\"",
"## 內容\n\n包含約 80 萬條由 BELLE 專案所產生的 *user* 與 *assistant* 的多輪對話。\n\n注意:此資料集是由 ChatGPT 產生的,未經嚴格校驗,內容可能包含錯誤。使用過程中請注意這一點。",
"## 限制和使用限制\n我們要求開發者僅將我們開源的程式碼、資料、模型及後續衍生物用於研究目的,不得用於商業,以及其他會對社會帶來危害的用途。\n\n由於數據是由*ChatGPT*產生的,未經嚴格驗證,在事實性和其他方面仍有一些不足之處。因此,在使用此資料集時,請務必注意甄別。\n\n本資料集不代表任何一方的立場、利益或想法,無關任何團體的任何類型的主張。因使用本資料集帶來的任何損害、糾紛,本專案的開發者不承擔任何責任。\n\n\n\n*",
"# Multiturn Chat 0.8M",
"## Contents\nIncludes approx. 0.8M Chinese multiturn dialogs between *human* and *assistant*.\nNote: this subset was generated by *ChatGPT* and was not strictly verified. The dialog contents might contain errors. Please take this in mind when using this subset.\ninstruction contains history dialog context, distinguishable by *Human:* and *Assistant:*, output contains the current reply by *assistant*."
] |
4d988b4db9265fa6e75966137d1a2954d43ade20 | created a total of 50 images
jlbaker361/ddpo-stability std: 0.3403565585613251 mean: 3.9143659400939943
jlbaker361/ddpo-stability-dcgan std: 0.2904449701309204 mean: 3.8854822635650637 | jlbaker361/stability-ddpo-evaluation-0-uncond | [
"region:us"
] | 2024-02-03T08:46:20+00:00 | {} | 2024-02-03T08:46:32+00:00 | [] | [] | TAGS
#region-us
| created a total of 50 images
jlbaker361/ddpo-stability std: 0.3403565585613251 mean: 3.9143659400939943
jlbaker361/ddpo-stability-dcgan std: 0.2904449701309204 mean: 3.8854822635650637 | [] | [
"TAGS\n#region-us \n"
] |
442fba93972d9cc642b9bbdafd123740e02d3250 |
# Dataset Card for "MELD_Text"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | zrr1999/MELD_Text | [
"region:us"
] | 2024-02-03T09:16:56+00:00 | {"dataset_info": {"config_name": "MELD_Text", "features": [{"name": "text", "dtype": "string"}, {"name": "emotion", "dtype": {"class_label": {"names": {"0": "neutral", "1": "joy", "2": "sadness", "3": "anger", "4": "fear", "5": "disgust", "6": "surprise"}}}}, {"name": "sentiment", "dtype": {"class_label": {"names": {"0": "neutral", "1": "positive", "2": "negative"}}}}], "splits": [{"name": "train", "num_bytes": 608623, "num_examples": 9989}, {"name": "validation", "num_bytes": 67287, "num_examples": 1109}, {"name": "test", "num_bytes": 162746, "num_examples": 2610}], "download_size": 1516414, "dataset_size": 838656}} | 2024-02-06T12:23:26+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for "MELD_Text"
More Information needed | [
"# Dataset Card for \"MELD_Text\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"MELD_Text\"\n\nMore Information needed"
] |
6ae2b32752ee6c3172d3fe6dd230be1f5031d3ef | # Dataset Card for "infographic-sections-0.3split"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | McSpicyWithMilo/infographic-sections-0.3split | [
"region:us"
] | 2024-02-03T09:28:12+00:00 | {"dataset_info": {"features": [{"name": "instruction_type", "dtype": "string"}, {"name": "instruction", "dtype": "string"}, {"name": "infographic_section", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 30370, "num_examples": 280}, {"name": "test", "num_bytes": 12584, "num_examples": 120}], "download_size": 20369, "dataset_size": 42954}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-02-03T09:28:22+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "infographic-sections-0.3split"
More Information needed | [
"# Dataset Card for \"infographic-sections-0.3split\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"infographic-sections-0.3split\"\n\nMore Information needed"
] |
1dff352e9b0037b25503423d767ba2be8bd0d271 | # Dataset Card for "instruction-types-0.3split"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | McSpicyWithMilo/instruction-types-0.3split | [
"region:us"
] | 2024-02-03T09:28:34+00:00 | {"dataset_info": {"features": [{"name": "instruction_type", "dtype": "string"}, {"name": "instruction", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 24468, "num_examples": 280}, {"name": "test", "num_bytes": 10561, "num_examples": 120}], "download_size": 18875, "dataset_size": 35029}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-02-03T09:28:43+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "instruction-types-0.3split"
More Information needed | [
"# Dataset Card for \"instruction-types-0.3split\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"instruction-types-0.3split\"\n\nMore Information needed"
] |
0cc856863fdffc6c2579a53bb4df1db9b12f5166 | duraad/thegroup-datafile-31k | [
"size_categories:10K<n<100K",
"language:ne",
"license:mit",
"region:us"
] | 2024-02-03T09:46:24+00:00 | {"language": ["ne"], "license": "mit", "size_categories": ["10K<n<100K"]} | 2024-02-03T09:47:44+00:00 | [] | [
"ne"
] | TAGS
#size_categories-10K<n<100K #language-Nepali (macrolanguage) #license-mit #region-us
| [] | [
"TAGS\n#size_categories-10K<n<100K #language-Nepali (macrolanguage) #license-mit #region-us \n"
] |
||
d98a6e240cf70a0b5239c669de13e06386db5103 | Filtered; only pairs longer than 8 words are kept. | arkanbima/js-en-id | [
"size_categories:1M<n<10M",
"language:id",
"license:mit",
"region:us"
] | 2024-02-03T10:04:38+00:00 | {"language": ["id"], "license": "mit", "size_categories": ["1M<n<10M"]} | 2024-02-03T10:37:42+00:00 | [] | [
"id"
] | TAGS
#size_categories-1M<n<10M #language-Indonesian #license-mit #region-us
| Filtered; only pairs longer than 8 words are kept. | [] | [
"TAGS\n#size_categories-1M<n<10M #language-Indonesian #license-mit #region-us \n"
] |
0c2f7738477425f695aeee72a09058e820701fdd |
# Dataset Card for Evaluation run of xriminact/TarsMeta
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [xriminact/TarsMeta](https://huggingface.co/xriminact/TarsMeta) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xriminact__TarsMeta",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-03T10:31:02.204074](https://huggingface.co/datasets/open-llm-leaderboard/details_xriminact__TarsMeta/blob/main/results_2024-02-03T10-31-02.204074.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5302500196137852,
"acc_stderr": 0.03442759984251229,
"acc_norm": 0.5307250558716725,
"acc_norm_stderr": 0.03513598960374511,
"mc1": 0.3268053855569155,
"mc1_stderr": 0.016419874731135018,
"mc2": 0.4787756504539016,
"mc2_stderr": 0.015592159506417262
},
"harness|arc:challenge|25": {
"acc": 0.49146757679180886,
"acc_stderr": 0.014609263165632179,
"acc_norm": 0.5290102389078498,
"acc_norm_stderr": 0.014586776355294314
},
"harness|hellaswag|10": {
"acc": 0.5958972316271659,
"acc_stderr": 0.004897146690596255,
"acc_norm": 0.7820155347540331,
"acc_norm_stderr": 0.00412032979668366
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5460526315789473,
"acc_stderr": 0.04051646342874142,
"acc_norm": 0.5460526315789473,
"acc_norm_stderr": 0.04051646342874142
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6037735849056604,
"acc_stderr": 0.030102793781791197,
"acc_norm": 0.6037735849056604,
"acc_norm_stderr": 0.030102793781791197
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5208333333333334,
"acc_stderr": 0.04177578950739993,
"acc_norm": 0.5208333333333334,
"acc_norm_stderr": 0.04177578950739993
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5317919075144508,
"acc_stderr": 0.03804749744364764,
"acc_norm": 0.5317919075144508,
"acc_norm_stderr": 0.03804749744364764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201943,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201943
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4723404255319149,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.4723404255319149,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.37719298245614036,
"acc_stderr": 0.04559522141958216,
"acc_norm": 0.37719298245614036,
"acc_norm_stderr": 0.04559522141958216
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3201058201058201,
"acc_stderr": 0.024026846392873506,
"acc_norm": 0.3201058201058201,
"acc_norm_stderr": 0.024026846392873506
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557835,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557835
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.635483870967742,
"acc_stderr": 0.02737987122994325,
"acc_norm": 0.635483870967742,
"acc_norm_stderr": 0.02737987122994325
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4088669950738916,
"acc_stderr": 0.034590588158832314,
"acc_norm": 0.4088669950738916,
"acc_norm_stderr": 0.034590588158832314
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6424242424242425,
"acc_stderr": 0.037425970438065864,
"acc_norm": 0.6424242424242425,
"acc_norm_stderr": 0.037425970438065864
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6565656565656566,
"acc_stderr": 0.03383201223244441,
"acc_norm": 0.6565656565656566,
"acc_norm_stderr": 0.03383201223244441
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7512953367875648,
"acc_stderr": 0.031195840877700304,
"acc_norm": 0.7512953367875648,
"acc_norm_stderr": 0.031195840877700304
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5128205128205128,
"acc_stderr": 0.025342671293807247,
"acc_norm": 0.5128205128205128,
"acc_norm_stderr": 0.025342671293807247
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131143,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131143
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5840336134453782,
"acc_stderr": 0.032016501007396114,
"acc_norm": 0.5840336134453782,
"acc_norm_stderr": 0.032016501007396114
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7302752293577982,
"acc_stderr": 0.019028486711115438,
"acc_norm": 0.7302752293577982,
"acc_norm_stderr": 0.019028486711115438
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502326,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502326
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03308611113236433,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03308611113236433
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6413502109704642,
"acc_stderr": 0.031219569445301843,
"acc_norm": 0.6413502109704642,
"acc_norm_stderr": 0.031219569445301843
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5336322869955157,
"acc_stderr": 0.033481800170603065,
"acc_norm": 0.5336322869955157,
"acc_norm_stderr": 0.033481800170603065
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5801526717557252,
"acc_stderr": 0.04328577215262972,
"acc_norm": 0.5801526717557252,
"acc_norm_stderr": 0.04328577215262972
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6694214876033058,
"acc_stderr": 0.04294340845212095,
"acc_norm": 0.6694214876033058,
"acc_norm_stderr": 0.04294340845212095
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.047803436269367894,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.047803436269367894
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6503067484662577,
"acc_stderr": 0.037466683254700206,
"acc_norm": 0.6503067484662577,
"acc_norm_stderr": 0.037466683254700206
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.027236013946196687,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.027236013946196687
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6717752234993615,
"acc_stderr": 0.016791685640192892,
"acc_norm": 0.6717752234993615,
"acc_norm_stderr": 0.016791685640192892
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5317919075144508,
"acc_stderr": 0.026864624366756646,
"acc_norm": 0.5317919075144508,
"acc_norm_stderr": 0.026864624366756646
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.26145251396648045,
"acc_stderr": 0.014696599650364562,
"acc_norm": 0.26145251396648045,
"acc_norm_stderr": 0.014696599650364562
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.028431095444176643,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.028431095444176643
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5852090032154341,
"acc_stderr": 0.02798268045975956,
"acc_norm": 0.5852090032154341,
"acc_norm_stderr": 0.02798268045975956
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5216049382716049,
"acc_stderr": 0.027794760105008736,
"acc_norm": 0.5216049382716049,
"acc_norm_stderr": 0.027794760105008736
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36524822695035464,
"acc_stderr": 0.028723863853281285,
"acc_norm": 0.36524822695035464,
"acc_norm_stderr": 0.028723863853281285
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3709256844850065,
"acc_stderr": 0.012337391684530312,
"acc_norm": 0.3709256844850065,
"acc_norm_stderr": 0.012337391684530312
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5220588235294118,
"acc_stderr": 0.030343264224213507,
"acc_norm": 0.5220588235294118,
"acc_norm_stderr": 0.030343264224213507
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5049019607843137,
"acc_stderr": 0.02022686271003946,
"acc_norm": 0.5049019607843137,
"acc_norm_stderr": 0.02022686271003946
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.04769300568972743,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.04769300568972743
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.563265306122449,
"acc_stderr": 0.031751952375833226,
"acc_norm": 0.563265306122449,
"acc_norm_stderr": 0.031751952375833226
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.746268656716418,
"acc_stderr": 0.03076944496729602,
"acc_norm": 0.746268656716418,
"acc_norm_stderr": 0.03076944496729602
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7426900584795322,
"acc_stderr": 0.03352799844161865,
"acc_norm": 0.7426900584795322,
"acc_norm_stderr": 0.03352799844161865
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3268053855569155,
"mc1_stderr": 0.016419874731135018,
"mc2": 0.4787756504539016,
"mc2_stderr": 0.015592159506417262
},
"harness|winogrande|5": {
"acc": 0.7277032359905288,
"acc_stderr": 0.012510697991453937
},
"harness|gsm8k|5": {
"acc": 0.5284306292645944,
"acc_stderr": 0.013750202076584419
}
}
```
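For instance, a small sketch (assuming you have saved the results dict above to a local `results.json`) that recovers the MMLU macro-average:
```python
import json

with open("results.json") as f:  # hypothetical local copy of the dict above
    results = json.load(f)

# Collect the acc_norm score of every MMLU (hendrycksTest) subtask.
mmlu = {
    name: scores["acc_norm"]
    for name, scores in results.items()
    if name.startswith("harness|hendrycksTest-")
}
print(f"{len(mmlu)} MMLU subtasks, macro-avg acc_norm = {sum(mmlu.values()) / len(mmlu):.4f}")
```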
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_xriminact__TarsMeta | [
"region:us"
] | 2024-02-03T10:33:17+00:00 | {"pretty_name": "Evaluation run of xriminact/TarsMeta", "dataset_summary": "Dataset automatically created during the evaluation run of model [xriminact/TarsMeta](https://huggingface.co/xriminact/TarsMeta) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xriminact__TarsMeta\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-03T10:31:02.204074](https://huggingface.co/datasets/open-llm-leaderboard/details_xriminact__TarsMeta/blob/main/results_2024-02-03T10-31-02.204074.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5302500196137852,\n \"acc_stderr\": 0.03442759984251229,\n \"acc_norm\": 0.5307250558716725,\n \"acc_norm_stderr\": 0.03513598960374511,\n \"mc1\": 0.3268053855569155,\n \"mc1_stderr\": 0.016419874731135018,\n \"mc2\": 0.4787756504539016,\n \"mc2_stderr\": 0.015592159506417262\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.49146757679180886,\n \"acc_stderr\": 0.014609263165632179,\n \"acc_norm\": 0.5290102389078498,\n \"acc_norm_stderr\": 0.014586776355294314\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5958972316271659,\n \"acc_stderr\": 0.004897146690596255,\n \"acc_norm\": 0.7820155347540331,\n \"acc_norm_stderr\": 0.00412032979668366\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5460526315789473,\n \"acc_stderr\": 0.04051646342874142,\n \"acc_norm\": 0.5460526315789473,\n \"acc_norm_stderr\": 0.04051646342874142\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791197,\n \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791197\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5208333333333334,\n \"acc_stderr\": 0.04177578950739993,\n \"acc_norm\": 0.5208333333333334,\n \"acc_norm_stderr\": 0.04177578950739993\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 
0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5317919075144508,\n \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.5317919075144508,\n \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201943,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201943\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4723404255319149,\n \"acc_stderr\": 0.03263597118409769,\n \"acc_norm\": 0.4723404255319149,\n \"acc_norm_stderr\": 0.03263597118409769\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.37719298245614036,\n \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3201058201058201,\n \"acc_stderr\": 0.024026846392873506,\n \"acc_norm\": 0.3201058201058201,\n \"acc_norm_stderr\": 0.024026846392873506\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04216370213557835,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04216370213557835\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.635483870967742,\n \"acc_stderr\": 0.02737987122994325,\n \"acc_norm\": 0.635483870967742,\n \"acc_norm_stderr\": 0.02737987122994325\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4088669950738916,\n \"acc_stderr\": 0.034590588158832314,\n \"acc_norm\": 0.4088669950738916,\n \"acc_norm_stderr\": 0.034590588158832314\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6424242424242425,\n \"acc_stderr\": 0.037425970438065864,\n \"acc_norm\": 0.6424242424242425,\n \"acc_norm_stderr\": 0.037425970438065864\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6565656565656566,\n \"acc_stderr\": 0.03383201223244441,\n \"acc_norm\": 0.6565656565656566,\n \"acc_norm_stderr\": 0.03383201223244441\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7512953367875648,\n \"acc_stderr\": 0.031195840877700304,\n \"acc_norm\": 0.7512953367875648,\n \"acc_norm_stderr\": 0.031195840877700304\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5128205128205128,\n 
\"acc_stderr\": 0.025342671293807247,\n \"acc_norm\": 0.5128205128205128,\n \"acc_norm_stderr\": 0.025342671293807247\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131143,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131143\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5840336134453782,\n \"acc_stderr\": 0.032016501007396114,\n \"acc_norm\": 0.5840336134453782,\n \"acc_norm_stderr\": 0.032016501007396114\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7302752293577982,\n \"acc_stderr\": 0.019028486711115438,\n \"acc_norm\": 0.7302752293577982,\n \"acc_norm_stderr\": 0.019028486711115438\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502326,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502326\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03308611113236433,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03308611113236433\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6413502109704642,\n \"acc_stderr\": 0.031219569445301843,\n \"acc_norm\": 0.6413502109704642,\n \"acc_norm_stderr\": 0.031219569445301843\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5336322869955157,\n \"acc_stderr\": 0.033481800170603065,\n \"acc_norm\": 0.5336322869955157,\n \"acc_norm_stderr\": 0.033481800170603065\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.04328577215262972,\n \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.04328577215262972\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6694214876033058,\n \"acc_stderr\": 0.04294340845212095,\n \"acc_norm\": 0.6694214876033058,\n \"acc_norm_stderr\": 0.04294340845212095\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.047803436269367894,\n \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.047803436269367894\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6503067484662577,\n \"acc_stderr\": 0.037466683254700206,\n \"acc_norm\": 0.6503067484662577,\n \"acc_norm_stderr\": 0.037466683254700206\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.027236013946196687,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.027236013946196687\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6717752234993615,\n \"acc_stderr\": 0.016791685640192892,\n \"acc_norm\": 
0.6717752234993615,\n \"acc_norm_stderr\": 0.016791685640192892\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5317919075144508,\n \"acc_stderr\": 0.026864624366756646,\n \"acc_norm\": 0.5317919075144508,\n \"acc_norm_stderr\": 0.026864624366756646\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.26145251396648045,\n \"acc_stderr\": 0.014696599650364562,\n \"acc_norm\": 0.26145251396648045,\n \"acc_norm_stderr\": 0.014696599650364562\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.028431095444176643,\n \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.028431095444176643\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5852090032154341,\n \"acc_stderr\": 0.02798268045975956,\n \"acc_norm\": 0.5852090032154341,\n \"acc_norm_stderr\": 0.02798268045975956\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5216049382716049,\n \"acc_stderr\": 0.027794760105008736,\n \"acc_norm\": 0.5216049382716049,\n \"acc_norm_stderr\": 0.027794760105008736\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.36524822695035464,\n \"acc_stderr\": 0.028723863853281285,\n \"acc_norm\": 0.36524822695035464,\n \"acc_norm_stderr\": 0.028723863853281285\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3709256844850065,\n \"acc_stderr\": 0.012337391684530312,\n \"acc_norm\": 0.3709256844850065,\n \"acc_norm_stderr\": 0.012337391684530312\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5220588235294118,\n \"acc_stderr\": 0.030343264224213507,\n \"acc_norm\": 0.5220588235294118,\n \"acc_norm_stderr\": 0.030343264224213507\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5049019607843137,\n \"acc_stderr\": 0.02022686271003946,\n \"acc_norm\": 0.5049019607843137,\n \"acc_norm_stderr\": 0.02022686271003946\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5454545454545454,\n \"acc_stderr\": 0.04769300568972743,\n \"acc_norm\": 0.5454545454545454,\n \"acc_norm_stderr\": 0.04769300568972743\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.563265306122449,\n \"acc_stderr\": 0.031751952375833226,\n \"acc_norm\": 0.563265306122449,\n \"acc_norm_stderr\": 0.031751952375833226\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.746268656716418,\n \"acc_stderr\": 0.03076944496729602,\n \"acc_norm\": 0.746268656716418,\n \"acc_norm_stderr\": 0.03076944496729602\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.4819277108433735,\n \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7426900584795322,\n \"acc_stderr\": 0.03352799844161865,\n \"acc_norm\": 0.7426900584795322,\n \"acc_norm_stderr\": 0.03352799844161865\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3268053855569155,\n \"mc1_stderr\": 0.016419874731135018,\n \"mc2\": 0.4787756504539016,\n \"mc2_stderr\": 0.015592159506417262\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7277032359905288,\n \"acc_stderr\": 0.012510697991453937\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5284306292645944,\n \"acc_stderr\": 0.013750202076584419\n }\n}\n```", "repo_url": 
"https://huggingface.co/xriminact/TarsMeta", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|arc:challenge|25_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|gsm8k|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hellaswag|10_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T10-31-02.204074.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T10-31-02.204074.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T10-31-02.204074.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T10-31-02.204074.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T10-31-02.204074.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_03T10_31_02.204074", "path": ["**/details_harness|winogrande|5_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-03T10-31-02.204074.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_03T10_31_02.204074", "path": ["results_2024-02-03T10-31-02.204074.parquet"]}, {"split": "latest", "path": ["results_2024-02-03T10-31-02.204074.parquet"]}]}]} | 2024-02-03T10:33:38+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of xriminact/TarsMeta
Dataset automatically created during the evaluation run of model xriminact/TarsMeta on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
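A minimal loading sketch — the repo id `open-llm-leaderboard/details_xriminact__TarsMeta` and the `harness_winogrande_5` config are inferred from the naming convention used by the other leaderboard detail cards, not stated in this stripped rendering:

```python
from datasets import load_dataset

# Repo id follows the leaderboard convention details_<org>__<model> (assumed here);
# the "train" split points at the latest run's results.
data = load_dataset("open-llm-leaderboard/details_xriminact__TarsMeta",
	"harness_winogrande_5",
	split="train")
```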
## Latest results
These are the latest results from run 2024-02-03T10:31:02.204074 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of xriminact/TarsMeta\n\n\n\nDataset automatically created during the evaluation run of model xriminact/TarsMeta on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T10:31:02.204074(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of xriminact/TarsMeta\n\n\n\nDataset automatically created during the evaluation run of model xriminact/TarsMeta on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T10:31:02.204074(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
e50c3eb022cbac4fc66ebc5f682f4c39e6b01eb8 | # Dataset Card for "wizard-en"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | DataGuard/wizard-en | [
"region:eu"
] | 2024-02-03T10:37:53+00:00 | {"dataset_info": {"features": [{"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 335577067, "num_examples": 143000}], "download_size": 160896543, "dataset_size": 335577067}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-03T10:38:16+00:00 | [] | [] | TAGS
#region-eu
| # Dataset Card for "wizard-en"
More Information needed | [
"# Dataset Card for \"wizard-en\"\n\nMore Information needed"
] | [
"TAGS\n#region-eu \n",
"# Dataset Card for \"wizard-en\"\n\nMore Information needed"
] |
67bca7b9708c33adddf1d83a6d0d42f880cf4fbe |
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | alpha-wolf-jin/cis-01 | [
"region:us"
] | 2024-02-03T10:57:56+00:00 | {} | 2024-02-03T11:04:23+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Dataset Name
This dataset card aims to be a base template for new datasets. It has been generated using this raw template.
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Dataset Name\n\n\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
59115815083cc156d272ccd24293f4f8a6ae7e4e | # Dataset Card for "Tashkeela"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | khalidalt/Tashkeela | [
"region:us"
] | 2024-02-03T11:21:04+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "text_no_taskheel", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1591938210.245426, "num_examples": 1592319}], "download_size": 726281863, "dataset_size": 1591938210.245426}} | 2024-02-03T11:25:37+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "Tashkeela"
More Information needed | [
"# Dataset Card for \"Tashkeela\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"Tashkeela\"\n\nMore Information needed"
] |
cbc95141278d0fc908c96c28c7603aee00e5b251 | # Dataset Card for "plantbert_fill_mask_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | CesarLeblanc/plantbert_fill_mask_dataset | [
"region:us"
] | 2024-02-03T11:28:59+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 211358912, "num_examples": 572231}], "download_size": 54856722, "dataset_size": 211358912}} | 2024-02-03T11:29:05+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "plantbert_fill_mask_dataset"
More Information needed | [
"# Dataset Card for \"plantbert_fill_mask_dataset\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"plantbert_fill_mask_dataset\"\n\nMore Information needed"
] |
491be0f3c533918633fdcce26e1cc024782a4190 |
# Dataset Card for Evaluation run of cloudyu/4bit_quant_TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cloudyu/4bit_quant_TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO](https://huggingface.co/cloudyu/4bit_quant_TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cloudyu__4bit_quant_TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO",
"harness_winogrande_5",
split="train")
```
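To pin a specific run rather than the latest results, the timestamped split can be requested instead. A sketch — the split name below is inferred from this card's run timestamp and the split-naming convention visible in the config metadata of sibling cards, so treat it as an assumption:

```python
from datasets import load_dataset

# Split names appear to be the run timestamp with '-' and ':' replaced by '_'
# (inferred convention); "latest" would return the same single run here.
data = load_dataset(
    "open-llm-leaderboard/details_cloudyu__4bit_quant_TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO",
    "harness_winogrande_5",
    split="2024_02_03T11_35_18.964075",
)
```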
## Latest results
These are the [latest results from run 2024-02-03T11:35:18.964075](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__4bit_quant_TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO/blob/main/results_2024-02-03T11-35-18.964075.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7527363709875337,
"acc_stderr": 0.028711415120135725,
"acc_norm": 0.7558124417156407,
"acc_norm_stderr": 0.029268003615455822,
"mc1": 0.5630354957160343,
"mc1_stderr": 0.017363844503195957,
"mc2": 0.7277883751034597,
"mc2_stderr": 0.014040395362394884
},
"harness|arc:challenge|25": {
"acc": 0.7133105802047781,
"acc_stderr": 0.013214986329274776,
"acc_norm": 0.7320819112627986,
"acc_norm_stderr": 0.01294203019513643
},
"harness|hellaswag|10": {
"acc": 0.6624178450507867,
"acc_stderr": 0.004719187890948062,
"acc_norm": 0.8610834495120494,
"acc_norm_stderr": 0.003451525868724678
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7111111111111111,
"acc_stderr": 0.03915450630414251,
"acc_norm": 0.7111111111111111,
"acc_norm_stderr": 0.03915450630414251
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8552631578947368,
"acc_stderr": 0.028631951845930387,
"acc_norm": 0.8552631578947368,
"acc_norm_stderr": 0.028631951845930387
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7962264150943397,
"acc_stderr": 0.024790784501775406,
"acc_norm": 0.7962264150943397,
"acc_norm_stderr": 0.024790784501775406
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.026280550932848062,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.026280550932848062
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5,
"acc_stderr": 0.04975185951049946,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04975185951049946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7617021276595745,
"acc_stderr": 0.02785125297388977,
"acc_norm": 0.7617021276595745,
"acc_norm_stderr": 0.02785125297388977
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5877192982456141,
"acc_stderr": 0.04630653203366596,
"acc_norm": 0.5877192982456141,
"acc_norm_stderr": 0.04630653203366596
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7034482758620689,
"acc_stderr": 0.03806142687309992,
"acc_norm": 0.7034482758620689,
"acc_norm_stderr": 0.03806142687309992
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.023919984164047732,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.023919984164047732
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5634920634920635,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.5634920634920635,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8774193548387097,
"acc_stderr": 0.018656720991789413,
"acc_norm": 0.8774193548387097,
"acc_norm_stderr": 0.018656720991789413
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6354679802955665,
"acc_stderr": 0.0338640574606209,
"acc_norm": 0.6354679802955665,
"acc_norm_stderr": 0.0338640574606209
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8545454545454545,
"acc_stderr": 0.027530196355066584,
"acc_norm": 0.8545454545454545,
"acc_norm_stderr": 0.027530196355066584
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9292929292929293,
"acc_stderr": 0.018263105420199505,
"acc_norm": 0.9292929292929293,
"acc_norm_stderr": 0.018263105420199505
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9585492227979274,
"acc_stderr": 0.014385432857476442,
"acc_norm": 0.9585492227979274,
"acc_norm_stderr": 0.014385432857476442
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7794871794871795,
"acc_stderr": 0.02102067268082791,
"acc_norm": 0.7794871794871795,
"acc_norm_stderr": 0.02102067268082791
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.030343862998512626,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.030343862998512626
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.023005459446673957,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.023005459446673957
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.45695364238410596,
"acc_stderr": 0.04067325174247443,
"acc_norm": 0.45695364238410596,
"acc_norm_stderr": 0.04067325174247443
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9137614678899083,
"acc_stderr": 0.012035597300116241,
"acc_norm": 0.9137614678899083,
"acc_norm_stderr": 0.012035597300116241
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.03167468706828979,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.03167468706828979
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9019607843137255,
"acc_stderr": 0.020871118455552097,
"acc_norm": 0.9019607843137255,
"acc_norm_stderr": 0.020871118455552097
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8987341772151899,
"acc_stderr": 0.019637720526065522,
"acc_norm": 0.8987341772151899,
"acc_norm_stderr": 0.019637720526065522
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7668161434977578,
"acc_stderr": 0.028380391147094702,
"acc_norm": 0.7668161434977578,
"acc_norm_stderr": 0.028380391147094702
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8549618320610687,
"acc_stderr": 0.03088466108951538,
"acc_norm": 0.8549618320610687,
"acc_norm_stderr": 0.03088466108951538
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8842975206611571,
"acc_stderr": 0.029199802455622793,
"acc_norm": 0.8842975206611571,
"acc_norm_stderr": 0.029199802455622793
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8796296296296297,
"acc_stderr": 0.03145703854306251,
"acc_norm": 0.8796296296296297,
"acc_norm_stderr": 0.03145703854306251
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8343558282208589,
"acc_stderr": 0.029208296231259104,
"acc_norm": 0.8343558282208589,
"acc_norm_stderr": 0.029208296231259104
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.883495145631068,
"acc_stderr": 0.031766839486404054,
"acc_norm": 0.883495145631068,
"acc_norm_stderr": 0.031766839486404054
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9444444444444444,
"acc_stderr": 0.015006312806446912,
"acc_norm": 0.9444444444444444,
"acc_norm_stderr": 0.015006312806446912
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9016602809706258,
"acc_stderr": 0.01064835630187633,
"acc_norm": 0.9016602809706258,
"acc_norm_stderr": 0.01064835630187633
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8034682080924855,
"acc_stderr": 0.02139396140436385,
"acc_norm": 0.8034682080924855,
"acc_norm_stderr": 0.02139396140436385
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7184357541899441,
"acc_stderr": 0.015042290171866136,
"acc_norm": 0.7184357541899441,
"acc_norm_stderr": 0.015042290171866136
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8562091503267973,
"acc_stderr": 0.020091188936043707,
"acc_norm": 0.8562091503267973,
"acc_norm_stderr": 0.020091188936043707
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7877813504823151,
"acc_stderr": 0.0232227567974351,
"acc_norm": 0.7877813504823151,
"acc_norm_stderr": 0.0232227567974351
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.019242526226544543,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.019242526226544543
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.599290780141844,
"acc_stderr": 0.0292334657455731,
"acc_norm": 0.599290780141844,
"acc_norm_stderr": 0.0292334657455731
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5853976531942634,
"acc_stderr": 0.012582597058908284,
"acc_norm": 0.5853976531942634,
"acc_norm_stderr": 0.012582597058908284
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.02315746830855934,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.02315746830855934
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8169934640522876,
"acc_stderr": 0.015643069911273337,
"acc_norm": 0.8169934640522876,
"acc_norm_stderr": 0.015643069911273337
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940588,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940588
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8122448979591836,
"acc_stderr": 0.025000256039546198,
"acc_norm": 0.8122448979591836,
"acc_norm_stderr": 0.025000256039546198
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9154228855721394,
"acc_stderr": 0.019675343217199173,
"acc_norm": 0.9154228855721394,
"acc_norm_stderr": 0.019675343217199173
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.9239766081871345,
"acc_stderr": 0.020327297744388385,
"acc_norm": 0.9239766081871345,
"acc_norm_stderr": 0.020327297744388385
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5630354957160343,
"mc1_stderr": 0.017363844503195957,
"mc2": 0.7277883751034597,
"mc2_stderr": 0.014040395362394884
},
"harness|winogrande|5": {
"acc": 0.829518547750592,
"acc_stderr": 0.010569021122825935
},
"harness|gsm8k|5": {
"acc": 0.7119029567854435,
"acc_stderr": 0.01247446973719792
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_cloudyu__4bit_quant_TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO | [
"region:us"
] | 2024-02-03T11:37:30+00:00 | {"pretty_name": "Evaluation run of cloudyu/4bit_quant_TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO", "dataset_summary": "Dataset automatically created during the evaluation run of model [cloudyu/4bit_quant_TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO](https://huggingface.co/cloudyu/4bit_quant_TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cloudyu__4bit_quant_TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-03T11:35:18.964075](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__4bit_quant_TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO/blob/main/results_2024-02-03T11-35-18.964075.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7527363709875337,\n \"acc_stderr\": 0.028711415120135725,\n \"acc_norm\": 0.7558124417156407,\n \"acc_norm_stderr\": 0.029268003615455822,\n \"mc1\": 0.5630354957160343,\n \"mc1_stderr\": 0.017363844503195957,\n \"mc2\": 0.7277883751034597,\n \"mc2_stderr\": 0.014040395362394884\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7133105802047781,\n \"acc_stderr\": 0.013214986329274776,\n \"acc_norm\": 0.7320819112627986,\n \"acc_norm_stderr\": 0.01294203019513643\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6624178450507867,\n \"acc_stderr\": 0.004719187890948062,\n \"acc_norm\": 0.8610834495120494,\n \"acc_norm_stderr\": 0.003451525868724678\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7111111111111111,\n \"acc_stderr\": 0.03915450630414251,\n \"acc_norm\": 0.7111111111111111,\n \"acc_norm_stderr\": 0.03915450630414251\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8552631578947368,\n \"acc_stderr\": 0.028631951845930387,\n \"acc_norm\": 0.8552631578947368,\n \"acc_norm_stderr\": 0.028631951845930387\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7962264150943397,\n \"acc_stderr\": 0.024790784501775406,\n \"acc_norm\": 0.7962264150943397,\n \"acc_norm_stderr\": 0.024790784501775406\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.026280550932848062,\n 
\"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.026280550932848062\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04975185951049946,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04975185951049946\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7617021276595745,\n \"acc_stderr\": 0.02785125297388977,\n \"acc_norm\": 0.7617021276595745,\n \"acc_norm_stderr\": 0.02785125297388977\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5877192982456141,\n \"acc_stderr\": 0.04630653203366596,\n \"acc_norm\": 0.5877192982456141,\n \"acc_norm_stderr\": 0.04630653203366596\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7034482758620689,\n \"acc_stderr\": 0.03806142687309992,\n \"acc_norm\": 0.7034482758620689,\n \"acc_norm_stderr\": 0.03806142687309992\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.023919984164047732,\n \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.023919984164047732\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5634920634920635,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.5634920634920635,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8774193548387097,\n \"acc_stderr\": 0.018656720991789413,\n \"acc_norm\": 0.8774193548387097,\n \"acc_norm_stderr\": 0.018656720991789413\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6354679802955665,\n \"acc_stderr\": 0.0338640574606209,\n \"acc_norm\": 0.6354679802955665,\n \"acc_norm_stderr\": 0.0338640574606209\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8545454545454545,\n \"acc_stderr\": 0.027530196355066584,\n \"acc_norm\": 0.8545454545454545,\n \"acc_norm_stderr\": 0.027530196355066584\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9292929292929293,\n \"acc_stderr\": 0.018263105420199505,\n \"acc_norm\": 0.9292929292929293,\n \"acc_norm_stderr\": 0.018263105420199505\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9585492227979274,\n \"acc_stderr\": 0.014385432857476442,\n 
\"acc_norm\": 0.9585492227979274,\n \"acc_norm_stderr\": 0.014385432857476442\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7794871794871795,\n \"acc_stderr\": 0.02102067268082791,\n \"acc_norm\": 0.7794871794871795,\n \"acc_norm_stderr\": 0.02102067268082791\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.45185185185185184,\n \"acc_stderr\": 0.030343862998512626,\n \"acc_norm\": 0.45185185185185184,\n \"acc_norm_stderr\": 0.030343862998512626\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.023005459446673957,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.023005459446673957\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.45695364238410596,\n \"acc_stderr\": 0.04067325174247443,\n \"acc_norm\": 0.45695364238410596,\n \"acc_norm_stderr\": 0.04067325174247443\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9137614678899083,\n \"acc_stderr\": 0.012035597300116241,\n \"acc_norm\": 0.9137614678899083,\n \"acc_norm_stderr\": 0.012035597300116241\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.03167468706828979,\n \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.03167468706828979\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9019607843137255,\n \"acc_stderr\": 0.020871118455552097,\n \"acc_norm\": 0.9019607843137255,\n \"acc_norm_stderr\": 0.020871118455552097\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8987341772151899,\n \"acc_stderr\": 0.019637720526065522,\n \"acc_norm\": 0.8987341772151899,\n \"acc_norm_stderr\": 0.019637720526065522\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7668161434977578,\n \"acc_stderr\": 0.028380391147094702,\n \"acc_norm\": 0.7668161434977578,\n \"acc_norm_stderr\": 0.028380391147094702\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8549618320610687,\n \"acc_stderr\": 0.03088466108951538,\n \"acc_norm\": 0.8549618320610687,\n \"acc_norm_stderr\": 0.03088466108951538\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8842975206611571,\n \"acc_stderr\": 0.029199802455622793,\n \"acc_norm\": 0.8842975206611571,\n \"acc_norm_stderr\": 0.029199802455622793\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8796296296296297,\n \"acc_stderr\": 0.03145703854306251,\n \"acc_norm\": 0.8796296296296297,\n \"acc_norm_stderr\": 0.03145703854306251\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8343558282208589,\n \"acc_stderr\": 0.029208296231259104,\n \"acc_norm\": 0.8343558282208589,\n \"acc_norm_stderr\": 0.029208296231259104\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.883495145631068,\n \"acc_stderr\": 0.031766839486404054,\n \"acc_norm\": 0.883495145631068,\n \"acc_norm_stderr\": 0.031766839486404054\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9444444444444444,\n \"acc_stderr\": 0.015006312806446912,\n \"acc_norm\": 0.9444444444444444,\n \"acc_norm_stderr\": 0.015006312806446912\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n 
\"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9016602809706258,\n \"acc_stderr\": 0.01064835630187633,\n \"acc_norm\": 0.9016602809706258,\n \"acc_norm_stderr\": 0.01064835630187633\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8034682080924855,\n \"acc_stderr\": 0.02139396140436385,\n \"acc_norm\": 0.8034682080924855,\n \"acc_norm_stderr\": 0.02139396140436385\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7184357541899441,\n \"acc_stderr\": 0.015042290171866136,\n \"acc_norm\": 0.7184357541899441,\n \"acc_norm_stderr\": 0.015042290171866136\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8562091503267973,\n \"acc_stderr\": 0.020091188936043707,\n \"acc_norm\": 0.8562091503267973,\n \"acc_norm_stderr\": 0.020091188936043707\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7877813504823151,\n \"acc_stderr\": 0.0232227567974351,\n \"acc_norm\": 0.7877813504823151,\n \"acc_norm_stderr\": 0.0232227567974351\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8611111111111112,\n \"acc_stderr\": 0.019242526226544543,\n \"acc_norm\": 0.8611111111111112,\n \"acc_norm_stderr\": 0.019242526226544543\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.599290780141844,\n \"acc_stderr\": 0.0292334657455731,\n \"acc_norm\": 0.599290780141844,\n \"acc_norm_stderr\": 0.0292334657455731\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5853976531942634,\n \"acc_stderr\": 0.012582597058908284,\n \"acc_norm\": 0.5853976531942634,\n \"acc_norm_stderr\": 0.012582597058908284\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.02315746830855934,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.02315746830855934\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8169934640522876,\n \"acc_stderr\": 0.015643069911273337,\n \"acc_norm\": 0.8169934640522876,\n \"acc_norm_stderr\": 0.015643069911273337\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04265792110940588,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04265792110940588\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8122448979591836,\n \"acc_stderr\": 0.025000256039546198,\n \"acc_norm\": 0.8122448979591836,\n \"acc_norm_stderr\": 0.025000256039546198\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9154228855721394,\n \"acc_stderr\": 0.019675343217199173,\n \"acc_norm\": 0.9154228855721394,\n \"acc_norm_stderr\": 0.019675343217199173\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.9239766081871345,\n \"acc_stderr\": 0.020327297744388385,\n \"acc_norm\": 0.9239766081871345,\n \"acc_norm_stderr\": 0.020327297744388385\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5630354957160343,\n \"mc1_stderr\": 0.017363844503195957,\n \"mc2\": 0.7277883751034597,\n \"mc2_stderr\": 0.014040395362394884\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.829518547750592,\n \"acc_stderr\": 0.010569021122825935\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.7119029567854435,\n \"acc_stderr\": 0.01247446973719792\n }\n}\n```", "repo_url": "https://huggingface.co/cloudyu/4bit_quant_TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|arc:challenge|25_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|gsm8k|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hellaswag|10_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T11-35-18.964075.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T11-35-18.964075.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T11-35-18.964075.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T11-35-18.964075.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T11-35-18.964075.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["**/details_harness|winogrande|5_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-03T11-35-18.964075.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_03T11_35_18.964075", "path": ["results_2024-02-03T11-35-18.964075.parquet"]}, {"split": "latest", "path": ["results_2024-02-03T11-35-18.964075.parquet"]}]}]} | 2024-02-03T11:37:51+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of cloudyu/4bit_quant_TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO
Dataset automatically created during the evaluation run of model cloudyu/4bit_quant_TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
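The snippet below mirrors the example given in this dataset's summary; the second call is a sketch showing the aggregated "results" configuration, whose name and splits are taken from this card's metadata.

```python
from datasets import load_dataset

# Per-sample details for one task (any config name listed in this card works);
# the "train" split always points to the latest run.
data = load_dataset("open-llm-leaderboard/details_cloudyu__4bit_quant_TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO",
	"harness_winogrande_5",
	split="train")

# Aggregated metrics for the run live in the separate "results" configuration;
# the "latest" split mirrors the most recent timestamped split.
results = load_dataset("open-llm-leaderboard/details_cloudyu__4bit_quant_TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO",
	"results",
	split="latest")
```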
## Latest results
These are the latest results from run 2024-02-03T11:35:18.964075 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of cloudyu/4bit_quant_TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO\n\n\n\nDataset automatically created during the evaluation run of model cloudyu/4bit_quant_TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T11:35:18.964075(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of cloudyu/4bit_quant_TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO\n\n\n\nDataset automatically created during the evaluation run of model cloudyu/4bit_quant_TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T11:35:18.964075(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
cf0d43cb963d4b13111ca791d12706818b236b05 | # Dataset Card for "plantbert_text_classification_dataset"
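The card body below is a stub. Based on the fold splits declared in this repo's configuration, the data can presumably be loaded as follows; the split and column names are taken from the visible config, so treat this as an unverified sketch rather than documented usage:

```python
from datasets import load_dataset

# The configuration declares ten cross-validation splits, fold_0 ... fold_9,
# each with a "label" (ClassLabel) and a "text" (string) column.
fold_0 = load_dataset("CesarLeblanc/plantbert_text_classification_dataset", split="fold_0")
print(fold_0.features)        # inspect the label names and text column
print(fold_0[0]["text"][:80])
```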
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | CesarLeblanc/plantbert_text_classification_dataset | [
"region:us"
] | 2024-02-03T11:44:17+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "fold_0", "path": "data/fold_0-*"}, {"split": "fold_1", "path": "data/fold_1-*"}, {"split": "fold_2", "path": "data/fold_2-*"}, {"split": "fold_3", "path": "data/fold_3-*"}, {"split": "fold_4", "path": "data/fold_4-*"}, {"split": "fold_5", "path": "data/fold_5-*"}, {"split": "fold_6", "path": "data/fold_6-*"}, {"split": "fold_7", "path": "data/fold_7-*"}, {"split": "fold_8", "path": "data/fold_8-*"}, {"split": "fold_9", "path": "data/fold_9-*"}]}], "dataset_info": {"features": [{"name": "label", "dtype": {"class_label": {"names": {"0": "MA211", "1": "MA221", "2": "MA222", "3": "MA223", "4": "MA224", "5": "MA225", "6": "MA232", "7": "MA241", "8": "MA251", "9": "MA252", "10": "MA253", "11": "N11", "12": "N12", "13": "N13", "14": "N14", "15": "N15", "16": "N16", "17": "N17", "18": "N18", "19": "N19", "20": "N1A", "21": "N1B", "22": "N1C", "23": "N1D", "24": "N1E", "25": "N1F", "26": "N1G", "27": "N1H", "28": "N1J", "29": "N21", "30": "N22", "31": "N31", "32": "N32", "33": "N33", "34": "N34", "35": "N35", "36": "Q11", "37": "Q12", "38": "Q21", "39": "Q22", "40": "Q23", "41": "Q24", "42": "Q25", "43": "Q41", "44": "Q42", "45": "Q43", "46": "Q44", "47": "Q45", "48": "Q46", "49": "Q51", "50": "Q52", "51": "Q53", "52": "Q54", "53": "R11", "54": "R12", "55": "R13", "56": "R14", "57": "R15", "58": "R16", "59": "R17", "60": "R18", "61": "R19", "62": "R1A", "63": "R1B", "64": "R1C", "65": "R1D", "66": "R1E", "67": "R1F", "68": "R1G", "69": "R1H", "70": "R1J", "71": "R1K", "72": "R1M", "73": "R1P", "74": "R1Q", "75": "R1R", "76": "R1S", "77": "R21", "78": "R22", "79": "R23", "80": "R24", "81": "R31", "82": "R32", "83": "R33", "84": "R34", "85": "R35", "86": "R36", "87": "R37", "88": "R41", "89": "R42", "90": "R43", "91": "R44", "92": "R45", "93": "R51", "94": "R52", "95": "R53", "96": "R54", "97": "R55", "98": "R56", "99": "R57", "100": "R61", "101": "R62", "102": "R63", "103": "R64", "104": "R65", "105": "S11", "106": "S12", "107": "S21", "108": "S22", "109": "S23", "110": "S24", "111": "S25", "112": "S26", "113": "S31", "114": "S32", "115": "S33", "116": "S34", "117": "S35", "118": "S36", "119": "S37", "120": "S38", "121": "S41", "122": "S42", "123": "S51", "124": "S52", "125": "S53", "126": "S54", "127": "S61", "128": "S62", "129": "S63", "130": "S64", "131": "S65", "132": "S66", "133": "S67", "134": "S68", "135": "S71", "136": "S72", "137": "S73", "138": "S74", "139": "S75", "140": "S76", "141": "S81", "142": "S82", "143": "S91", "144": "S92", "145": "S93", "146": "S94", "147": "T11", "148": "T12", "149": "T13", "150": "T14", "151": "T15", "152": "T16", "153": "T17", "154": "T18", "155": "T19", "156": "T1A", "157": "T1B", "158": "T1C", "159": "T1D", "160": "T1E", "161": "T1F", "162": "T1G", "163": "T1H", "164": "T21", "165": "T22", "166": "T23", "167": "T24", "168": "T25", "169": "T27", "170": "T28", "171": "T29", "172": "T31", "173": "T32", "174": "T33", "175": "T34", "176": "T35", "177": "T36", "178": "T37", "179": "T38", "180": "T39", "181": "T3A", "182": "T3B", "183": "T3C", "184": "T3D", "185": "T3E", "186": "T3F", "187": "T3G", "188": "T3H", "189": "T3J", "190": "T3K", "191": "T3M", "192": "U21", "193": "U22", "194": "U23", "195": "U24", "196": "U25", "197": "U26", "198": "U27", "199": "U28", "200": "U29", "201": "U2A", "202": "U32", "203": "U33", "204": "U34", "205": "U35", "206": "U36", "207": "U37", "208": "U38", "209": "U3A", "210": "U3B", "211": "U3C", "212": "U3D", "213": "U61", "214": 
"U62", "215": "V11", "216": "V12", "217": "V13", "218": "V14", "219": "V15", "220": "V32", "221": "V33", "222": "V34", "223": "V35", "224": "V37", "225": "V38", "226": "V39"}}}}, {"name": "text", "dtype": "string"}], "splits": [{"name": "fold_0", "num_bytes": 37135896, "num_examples": 85087}, {"name": "fold_1", "num_bytes": 36025033, "num_examples": 85076}, {"name": "fold_2", "num_bytes": 35613576, "num_examples": 85115}, {"name": "fold_3", "num_bytes": 36680348, "num_examples": 85067}, {"name": "fold_4", "num_bytes": 36877319, "num_examples": 85065}, {"name": "fold_5", "num_bytes": 36029591, "num_examples": 85081}, {"name": "fold_6", "num_bytes": 36277596, "num_examples": 85148}, {"name": "fold_7", "num_bytes": 36033390, "num_examples": 85082}, {"name": "fold_8", "num_bytes": 36234393, "num_examples": 85053}, {"name": "fold_9", "num_bytes": 35973208, "num_examples": 85159}], "download_size": 98159513, "dataset_size": 362880350}} | 2024-02-03T11:44:42+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "plantbert_text_classification_dataset"
More Information needed | [
"# Dataset Card for \"plantbert_text_classification_dataset\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"plantbert_text_classification_dataset\"\n\nMore Information needed"
] |
ad8006393fdbfa7753013c85bf243c067a23338e |
# Usage
When downloading, specify which files you want to download and set the split to `train` (required by `datasets`).
```python
from datasets import load_dataset
words = load_dataset("fairnlp/weat", data_files=["words.parquet"], split="train")
associations = load_dataset("fairnlp/weat", data_files=["associations_weat.parquet"], split="train")
```
# Dataset Card for Word Embedding Association Test (WEAT)
This dataset contains the source words of the original Word Embedding Association Test (WEAT) as
described [by Caliskan et. al. (2016)](https://arxiv.org/abs/1608.07187).
## Dataset Details
The dataset contains word lists and attribute lists used to compute several WEAT scores for different embedding
associations. For details on the methodology, please refer to the original paper. This dataset is contributed to Hugging
Face as part of the WEAT implementation in the [FairNLP `fairscore` library](https://github.com/FairNLP/fairscore/).
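
For orientation, the score these word and attribute lists feed into can be sketched as follows. This is a minimal NumPy illustration of the WEAT effect size from the paper, not the `fairscore` implementation itself; the toy random vectors stand in for real word embeddings, and details such as the standard-deviation normalization (sample vs. population) vary between implementations:

```python
import numpy as np

def cosine(u, v):
    # Cosine similarity between two 1-D vectors
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

def association(w, A, B):
    # s(w, A, B): mean similarity of w to attribute set A minus to attribute set B
    return np.mean([cosine(w, a) for a in A]) - np.mean([cosine(w, b) for b in B])

def weat_effect_size(X, Y, A, B):
    # (mean_x s(x, A, B) - mean_y s(y, A, B)) / std of s over all words in X and Y
    s_x = [association(x, A, B) for x in X]
    s_y = [association(y, A, B) for y in Y]
    return (np.mean(s_x) - np.mean(s_y)) / np.std(s_x + s_y, ddof=1)

# Toy example: random "embeddings" for two target sets and two attribute sets
rng = np.random.default_rng(0)
X, Y, A, B = (list(rng.normal(size=(5, 50))) for _ in range(4))
print(weat_effect_size(X, Y, A, B))
```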
### Dataset Sources
- **Paper [optional]:** lcs.bath.ac.uk/~jjb/ftp/CaliskanSemantics-Arxiv.pdf
**BibTeX:**
```bibtex
@article{DBLP:journals/corr/IslamBN16,
author = {Aylin Caliskan Islam and
Joanna J. Bryson and
Arvind Narayanan},
title = {Semantics derived automatically from language corpora necessarily
contain human biases},
journal = {CoRR},
volume = {abs/1608.07187},
year = {2016},
url = {http://arxiv.org/abs/1608.07187},
eprinttype = {arXiv},
eprint = {1608.07187},
timestamp = {Sat, 23 Jan 2021 01:20:12 +0100},
biburl = {https://dblp.org/rec/journals/corr/IslamBN16.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
| fairnlp/weat | [
"language:en",
"arxiv:1608.07187",
"region:us"
] | 2024-02-03T12:36:02+00:00 | {"language": ["en"], "configs": [{"config_name": "words", "data_files": [{"split": "words", "path": "words.parquet"}]}, {"config_name": "associations", "data_files": [{"split": "associations_weat", "path": "associations_weat.parquet"}]}, {"config_name": "associations_wefat", "data_files": [{"split": "associations_wefat", "path": "associations_wefat.parquet"}]}]} | 2024-02-03T12:36:06+00:00 | [
"1608.07187"
] | [
"en"
] | TAGS
#language-English #arxiv-1608.07187 #region-us
|
# Usage
When downloading, specify which files you want to download and set the split to 'train' (required by 'datasets').
# Dataset Card for Word Embedding Association Test (WEAT)
This dataset contains the source words of the original Word Embedding Association Test (WEAT) as
described by Caliskan et al. (2016).
## Dataset Details
The dataset contains word lists and attribute lists used to compute several WEAT scores for different embedding
associations. For details on the methodology, please refer to the original paper. This dataset is contributed to Hugging
Face as part of the WEAT implementation in the FairNLP 'fairscore' library.
### Dataset Sources
- Paper [optional]: URL
BibTeX:
| [
"# Usage\n\nWhen downloading, specify which files you want to download and set the split to 'train' (required by 'datasets').",
"# Dataset Card for Word Embedding Association Test (WEAT)\n\nThis dataset contains the source words of the original Word Embedding Association Test (WEAT) as\ndescribed by Caliskan et. al. (2016).",
"## Dataset Details\n\nThe dataset contains word lists and attribute lists used to compute several WEAT scores for different embedding\nassociations. For details on the methodology, please refer to the original paper. This dataset is contributed to Hugging\nFace as part of the WEAT implementation in the FairNLP 'fairscore' library.",
"### Dataset Sources\n\n- Paper [optional]: URL\n\nBibTeX:"
] | [
"TAGS\n#language-English #arxiv-1608.07187 #region-us \n",
"# Usage\n\nWhen downloading, specify which files you want to download and set the split to 'train' (required by 'datasets').",
"# Dataset Card for Word Embedding Association Test (WEAT)\n\nThis dataset contains the source words of the original Word Embedding Association Test (WEAT) as\ndescribed by Caliskan et. al. (2016).",
"## Dataset Details\n\nThe dataset contains word lists and attribute lists used to compute several WEAT scores for different embedding\nassociations. For details on the methodology, please refer to the original paper. This dataset is contributed to Hugging\nFace as part of the WEAT implementation in the FairNLP 'fairscore' library.",
"### Dataset Sources\n\n- Paper [optional]: URL\n\nBibTeX:"
] |
444aff7370eefe23e3af494f81790b2a25df6e89 | # Dataset Card for Nafsy
<!-- Provide a quick summary of the dataset. -->
This Arabic dataset is a set of mental health articles. The original dataset was scraped from [Nafsy.net](https://nafsy.net/).
## Dataset Details
**Language(s) (NLP):** Arabic
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
Fine-tuning LLMs for the mental health domain.
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
It is a CSV file with columns:
- content: the articles
- text_size: length of article
- topic: top 10 words that describe the topics of the article
- prob: topic prediction accuracy
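
A minimal way to inspect these columns with the `datasets` library (the repo id and split name are taken from this card's configuration; a sketch):

```python
from datasets import load_dataset

ds = load_dataset("MuhammadHelmy/nafsy", split="train")
print(ds.column_names)                # ['content', 'text_size', 'topic', 'prob']
print(ds[0]["topic"], ds[0]["prob"])  # topic words and prediction confidence for one article
```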
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
Creating an Arabic chatbot for mental health support.
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
- This dataset was originally scraped from [Nafsy.net](https://nafsy.net/) then uploaded to Kaggle.
- Additional preprocessing was performed by this repo owner:
  - Cleaning data: removing URLs, extra spaces, and non-words, detaching punctuation, and dropping duplicates
  - Applying Topic Modeling to generate main topics for each article using the bert-base-arabic model
  - Deduplicating data using sentence-transformers (paraphrase-multilingual-MiniLM-L12-v2); see the sketch after this list
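
The deduplication step above could look roughly like the following. This is a hypothetical reconstruction, not the repo owner's actual script; the 0.9 similarity threshold and the greedy keep-first strategy are assumptions:

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")

def deduplicate(texts, threshold=0.9):
    # Keep an article only if it is not too similar to any article already kept.
    embeddings = model.encode(texts, convert_to_tensor=True, normalize_embeddings=True)
    kept_texts, kept_idx = [], []
    for i, text in enumerate(texts):
        if kept_idx:
            sims = util.cos_sim(embeddings[i], embeddings[kept_idx])
            if sims.max().item() >= threshold:
                continue  # near-duplicate of something we already have
        kept_texts.append(text)
        kept_idx.append(i)
    return kept_texts

articles = ["نص المقال الأول", "نص المقال الأول", "مقال مختلف تماما"]
print(deduplicate(articles))  # the exact duplicate is dropped
```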
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[husamal](https://www.kaggle.com/husamal)
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
@misc{Husamal_2021, title={Arabic-physcology-dataset}, url={https://www.kaggle.com/datasets/husamal/arabicphyscologydataset?select=nafsy.csv}, journal={Kaggle}, author={Husamal}, year={2021}, month={May}}
## Dataset Card Authors
Muhammad Helmy
## Dataset Card Contact
[email protected] | MuhammadHelmy/nafsy | [
"task_categories:conversational",
"task_categories:text-generation",
"task_categories:text-classification",
"size_categories:1K<n<10K",
"language:ar",
"mental health",
"region:us"
] | 2024-02-03T12:47:57+00:00 | {"language": ["ar"], "size_categories": ["1K<n<10K"], "task_categories": ["conversational", "text-generation", "text-classification"], "tags": ["mental health"], "dataset_info": {"features": [{"name": "content", "dtype": "string"}, {"name": "text_size", "dtype": "int64"}, {"name": "topic", "dtype": "string"}, {"name": "prob", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 6007437.514440433, "num_examples": 1884}], "download_size": 2896563, "dataset_size": 6007437.514440433}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-02-08T15:07:39+00:00 | [] | [
"ar"
] | TAGS
#task_categories-conversational #task_categories-text-generation #task_categories-text-classification #size_categories-1K<n<10K #language-Arabic #mental health #region-us
| # Dataset Card for Nafsy
This Arabic dataset is a set of mental health articles. The original dataset was scraped from URL.
## Dataset Details
Language(s) (NLP): Arabic
## Uses
### Direct Use
Fine-tuning LLMs for the mental health domain.
## Dataset Structure
It is a CSV file with columns:
- content: the articles
- text_size: length of article
- topic: top 10 words that describe the topics of the article
- prob: topic prediction accuracy
## Dataset Creation
### Curation Rationale
Creating an Arabic chatbot for mental health support.
### Source Data
#### Data Collection and Processing
- This dataset was originally scraped from URL then uploaded to Kaggle.
- Additional preprocessing was performed by this repo owner:
  - Cleaning data: removing URLs, extra spaces, and non-words, detaching punctuation, and dropping duplicates
  - Applying Topic Modeling to generate main topics for each article using the bert-base-arabic model
  - Deduplicating data using sentence-transformers (paraphrase-multilingual-MiniLM-L12-v2)
#### Who are the source data producers?
husamal
[optional]
BibTeX:
@misc{Husamal_2021, title={Arabic-physcology-dataset}, url={URL journal={Kaggle}, author={Husamal}, year={2021}, month={May}}
## Dataset Card Authors
Muhammad Helmy
## Dataset Card Contact
muhammadhelmymmo@URL | [
"# Dataset Card for Nafsy \n\n\n\nThis arabic dataset is a set of mental health articles. The original dataset was scrapped from URL.",
"## Dataset Details\n\nLanguage(s) (NLP): Arabic",
"## Uses",
"### Direct Use\n\n\n\nFine-tuning llm for the mental health domain.",
"## Dataset Structure\n\n\n\nIt is a CSV file with columns:\n- content: the articles\n- text_size: length of article\n- topic: top 10 words that describe the topics of the article\n- prob: topic prediction accuracy",
"## Dataset Creation",
"### Curation Rationale\n\n\n\nCreating an arabic chatbot for mental health support.",
"### Source Data",
"#### Data Collection and Processing\n\n\n\n- This dataset was originally scrapped from URL then uploaded to Kaggle.\n- An additional preprocessing was made by this repo owner:\n - Cleaning data: removing urls, extra spaces, and non words, detach punctuations, and dropping duplicates\n - Applying Topic Modeling to generate main topics for each article using bert-base-arabic model\n - Deduplicating data using sentence-transformers (paraphrase-multilingual-MiniLM-L12-v2)",
"#### Who are the source data producers?\n\n\n\nhusamal\n\n[optional]\n\n\n\nBibTeX:\n\n@misc{Husamal_2021, title={Arabic-physcology-dataset}, url={URL journal={Kaggle}, author={Husamal}, year={2021}, month={May}}",
"## Dataset Card Authors\n\nMuhammad Helmy",
"## Dataset Card Contact\n\nmuhammadhelmymmo@URL"
] | [
"TAGS\n#task_categories-conversational #task_categories-text-generation #task_categories-text-classification #size_categories-1K<n<10K #language-Arabic #mental health #region-us \n",
"# Dataset Card for Nafsy \n\n\n\nThis arabic dataset is a set of mental health articles. The original dataset was scrapped from URL.",
"## Dataset Details\n\nLanguage(s) (NLP): Arabic",
"## Uses",
"### Direct Use\n\n\n\nFine-tuning llm for the mental health domain.",
"## Dataset Structure\n\n\n\nIt is a CSV file with columns:\n- content: the articles\n- text_size: length of article\n- topic: top 10 words that describe the topics of the article\n- prob: topic prediction accuracy",
"## Dataset Creation",
"### Curation Rationale\n\n\n\nCreating an arabic chatbot for mental health support.",
"### Source Data",
"#### Data Collection and Processing\n\n\n\n- This dataset was originally scrapped from URL then uploaded to Kaggle.\n- An additional preprocessing was made by this repo owner:\n - Cleaning data: removing urls, extra spaces, and non words, detach punctuations, and dropping duplicates\n - Applying Topic Modeling to generate main topics for each article using bert-base-arabic model\n - Deduplicating data using sentence-transformers (paraphrase-multilingual-MiniLM-L12-v2)",
"#### Who are the source data producers?\n\n\n\nhusamal\n\n[optional]\n\n\n\nBibTeX:\n\n@misc{Husamal_2021, title={Arabic-physcology-dataset}, url={URL journal={Kaggle}, author={Husamal}, year={2021}, month={May}}",
"## Dataset Card Authors\n\nMuhammad Helmy",
"## Dataset Card Contact\n\nmuhammadhelmymmo@URL"
] |
ff9235faafedb97e30c2fc52ca01982e6ae6ec3c |
# Dataset Card for Evaluation run of gmonsoon/OpenMia-Indo-Mistral-7b-v3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [gmonsoon/OpenMia-Indo-Mistral-7b-v3](https://huggingface.co/gmonsoon/OpenMia-Indo-Mistral-7b-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_gmonsoon__OpenMia-Indo-Mistral-7b-v3",
"harness_winogrande_5",
split="train")
```
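
Following the same naming pattern, the aggregated metrics described above live in the dedicated "results" configuration; a sketch (config and split names are inferred from this card, not separately verified):

```python
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_gmonsoon__OpenMia-Indo-Mistral-7b-v3",
    "results",
    split="train",  # per the card, "train" always points to the latest run
)
print(results[0])
```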
## Latest results
These are the [latest results from run 2024-02-03T13:09:15.907579](https://huggingface.co/datasets/open-llm-leaderboard/details_gmonsoon__OpenMia-Indo-Mistral-7b-v3/blob/main/results_2024-02-03T13-09-15.907579.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6442912881257177,
"acc_stderr": 0.03218732611252043,
"acc_norm": 0.6442770991903772,
"acc_norm_stderr": 0.03285689028348496,
"mc1": 0.42717258261933905,
"mc1_stderr": 0.01731683441096393,
"mc2": 0.6004582343641779,
"mc2_stderr": 0.015252431767364912
},
"harness|arc:challenge|25": {
"acc": 0.6390784982935154,
"acc_stderr": 0.014034761386175458,
"acc_norm": 0.6612627986348123,
"acc_norm_stderr": 0.013830568927974332
},
"harness|hellaswag|10": {
"acc": 0.6605257916749652,
"acc_stderr": 0.004725630911520326,
"acc_norm": 0.8547102170882295,
"acc_norm_stderr": 0.003516725751784838
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7358490566037735,
"acc_stderr": 0.02713429162874171,
"acc_norm": 0.7358490566037735,
"acc_norm_stderr": 0.02713429162874171
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.02544636563440678,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.02544636563440678
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083522,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083522
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026705,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026705
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.02247325333276877,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.02247325333276877
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6871794871794872,
"acc_stderr": 0.023507579020645358,
"acc_norm": 0.6871794871794872,
"acc_norm_stderr": 0.023507579020645358
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02874204090394848,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02874204090394848
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374294,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374294
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.033922384053216174,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.033922384053216174
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078966,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078966
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601432,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601432
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.013468201614066295,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.013468201614066295
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258176,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39217877094972065,
"acc_stderr": 0.01632906107320745,
"acc_norm": 0.39217877094972065,
"acc_norm_stderr": 0.01632906107320745
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.026090162504279053,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.026090162504279053
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818767,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818767
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.02492200116888633,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.02492200116888633
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.455019556714472,
"acc_stderr": 0.012718456618701766,
"acc_norm": 0.455019556714472,
"acc_norm_stderr": 0.012718456618701766
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146292,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146292
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6486928104575164,
"acc_stderr": 0.019312676065786554,
"acc_norm": 0.6486928104575164,
"acc_norm_stderr": 0.019312676065786554
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.689795918367347,
"acc_stderr": 0.029613459872484378,
"acc_norm": 0.689795918367347,
"acc_norm_stderr": 0.029613459872484378
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.02768691358801302,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.02768691358801302
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197771,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197771
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42717258261933905,
"mc1_stderr": 0.01731683441096393,
"mc2": 0.6004582343641779,
"mc2_stderr": 0.015252431767364912
},
"harness|winogrande|5": {
"acc": 0.829518547750592,
"acc_stderr": 0.01056902112282591
},
"harness|gsm8k|5": {
"acc": 0.6762699014404853,
"acc_stderr": 0.012888247397371143
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_gmonsoon__OpenMia-Indo-Mistral-7b-v3 | [
"region:us"
] | 2024-02-03T13:11:33+00:00 | {"pretty_name": "Evaluation run of gmonsoon/OpenMia-Indo-Mistral-7b-v3", "dataset_summary": "Dataset automatically created during the evaluation run of model [gmonsoon/OpenMia-Indo-Mistral-7b-v3](https://huggingface.co/gmonsoon/OpenMia-Indo-Mistral-7b-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_gmonsoon__OpenMia-Indo-Mistral-7b-v3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-03T13:09:15.907579](https://huggingface.co/datasets/open-llm-leaderboard/details_gmonsoon__OpenMia-Indo-Mistral-7b-v3/blob/main/results_2024-02-03T13-09-15.907579.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6442912881257177,\n \"acc_stderr\": 0.03218732611252043,\n \"acc_norm\": 0.6442770991903772,\n \"acc_norm_stderr\": 0.03285689028348496,\n \"mc1\": 0.42717258261933905,\n \"mc1_stderr\": 0.01731683441096393,\n \"mc2\": 0.6004582343641779,\n \"mc2_stderr\": 0.015252431767364912\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6390784982935154,\n \"acc_stderr\": 0.014034761386175458,\n \"acc_norm\": 0.6612627986348123,\n \"acc_norm_stderr\": 0.013830568927974332\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6605257916749652,\n \"acc_stderr\": 0.004725630911520326,\n \"acc_norm\": 0.8547102170882295,\n \"acc_norm_stderr\": 0.003516725751784838\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7358490566037735,\n \"acc_stderr\": 0.02713429162874171,\n \"acc_norm\": 0.7358490566037735,\n \"acc_norm_stderr\": 0.02713429162874171\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n \"acc_norm_stderr\": 0.03716177437566017\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440678,\n \"acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440678\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083522,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083522\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026705,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026705\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.02247325333276877,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 
0.02247325333276877\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6871794871794872,\n \"acc_stderr\": 0.023507579020645358,\n \"acc_norm\": 0.6871794871794872,\n \"acc_norm_stderr\": 0.023507579020645358\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394848,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394848\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374294,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374294\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5509259259259259,\n \"acc_stderr\": 0.033922384053216174,\n \"acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.033922384053216174\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078966,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078966\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601432,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601432\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8288633461047255,\n \"acc_stderr\": 0.013468201614066295,\n \"acc_norm\": 0.8288633461047255,\n \"acc_norm_stderr\": 0.013468201614066295\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258176,\n \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258176\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39217877094972065,\n \"acc_stderr\": 0.01632906107320745,\n \"acc_norm\": 0.39217877094972065,\n \"acc_norm_stderr\": 0.01632906107320745\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.026090162504279053,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.026090162504279053\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.025922371788818767,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.025922371788818767\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.02492200116888633,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.02492200116888633\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.455019556714472,\n \"acc_stderr\": 0.012718456618701766,\n \"acc_norm\": 0.455019556714472,\n \"acc_norm_stderr\": 0.012718456618701766\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146292,\n \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146292\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6486928104575164,\n \"acc_stderr\": 0.019312676065786554,\n \"acc_norm\": 0.6486928104575164,\n \"acc_norm_stderr\": 0.019312676065786554\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.689795918367347,\n \"acc_stderr\": 0.029613459872484378,\n \"acc_norm\": 0.689795918367347,\n \"acc_norm_stderr\": 0.029613459872484378\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n \"acc_stderr\": 0.02768691358801302,\n \"acc_norm\": 0.8109452736318408,\n \"acc_norm_stderr\": 0.02768691358801302\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197771,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197771\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42717258261933905,\n \"mc1_stderr\": 0.01731683441096393,\n \"mc2\": 0.6004582343641779,\n \"mc2_stderr\": 0.015252431767364912\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.829518547750592,\n \"acc_stderr\": 0.01056902112282591\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6762699014404853,\n \"acc_stderr\": 0.012888247397371143\n 
}\n}\n```", "repo_url": "https://huggingface.co/gmonsoon/OpenMia-Indo-Mistral-7b-v3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|arc:challenge|25_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|gsm8k|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hellaswag|10_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T13-09-15.907579.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T13-09-15.907579.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T13-09-15.907579.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T13-09-15.907579.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T13-09-15.907579.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_03T13_09_15.907579", "path": ["**/details_harness|winogrande|5_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-03T13-09-15.907579.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_03T13_09_15.907579", "path": ["results_2024-02-03T13-09-15.907579.parquet"]}, {"split": "latest", "path": ["results_2024-02-03T13-09-15.907579.parquet"]}]}]} | 2024-02-03T13:11:55+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of gmonsoon/OpenMia-Indo-Mistral-7b-v3
Dataset automatically created during the evaluation run of model gmonsoon/OpenMia-Indo-Mistral-7b-v3 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
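(A minimal sketch; the dataset repo id below is inferred from the leaderboard's usual details_<org>__<model> naming pattern for this model.)

```python
from datasets import load_dataset

# Load one of the per-task configurations; "harness_winogrande_5" is used
# here as an example, matching the other details repos on the leaderboard.
data = load_dataset("open-llm-leaderboard/details_gmonsoon__OpenMia-Indo-Mistral-7b-v3",
	"harness_winogrande_5",
	split="train")
```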
## Latest results
These are the latest results from run 2024-02-03T13:09:15.907579 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of gmonsoon/OpenMia-Indo-Mistral-7b-v3\n\n\n\nDataset automatically created during the evaluation run of model gmonsoon/OpenMia-Indo-Mistral-7b-v3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T13:09:15.907579(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of gmonsoon/OpenMia-Indo-Mistral-7b-v3\n\n\n\nDataset automatically created during the evaluation run of model gmonsoon/OpenMia-Indo-Mistral-7b-v3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T13:09:15.907579(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
3b87b011bbffc8d5317c363d6e451db394ac1c4d |
# Dataset Card for Evaluation run of NobodyExistsOnTheInternet/clown-SUV-4x70b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [NobodyExistsOnTheInternet/clown-SUV-4x70b](https://huggingface.co/NobodyExistsOnTheInternet/clown-SUV-4x70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NobodyExistsOnTheInternet__clown-SUV-4x70b",
"harness_winogrande_5",
split="train")
```
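
The 63 per-task configurations (plus the aggregated "results" configuration) can also be listed programmatically; the snippet below is a small sketch using the `datasets` library's config-listing helper:

```python
from datasets import get_dataset_config_names

# Enumerate every configuration exposed by this details repo
# (one per evaluated task, plus the aggregated "results" config).
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_NobodyExistsOnTheInternet__clown-SUV-4x70b"
)
print(len(configs), configs[:5])
```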
## Latest results
These are the [latest results from run 2024-02-03T13:18:34.281350](https://huggingface.co/datasets/open-llm-leaderboard/details_NobodyExistsOnTheInternet__clown-SUV-4x70b/blob/main/results_2024-02-03T13-18-34.281350.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2424611028294509,
"acc_stderr": 0.03031610890226359,
"acc_norm": 0.2428216424826148,
"acc_norm_stderr": 0.031122287295333263,
"mc1": 0.22888616891064872,
"mc1_stderr": 0.014706994909055027,
"mc2": 0.48811709450476065,
"mc2_stderr": 0.016595901285138773
},
"harness|arc:challenge|25": {
"acc": 0.20136518771331058,
"acc_stderr": 0.011718927477444265,
"acc_norm": 0.24744027303754265,
"acc_norm_stderr": 0.01261035266329267
},
"harness|hellaswag|10": {
"acc": 0.26777534355706034,
"acc_stderr": 0.004418948941099406,
"acc_norm": 0.28291177056363276,
"acc_norm_stderr": 0.00449493402546234
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322674,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322674
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.15555555555555556,
"acc_stderr": 0.031309483648783144,
"acc_norm": 0.15555555555555556,
"acc_norm_stderr": 0.031309483648783144
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.23026315789473684,
"acc_stderr": 0.03426059424403165,
"acc_norm": 0.23026315789473684,
"acc_norm_stderr": 0.03426059424403165
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.22641509433962265,
"acc_stderr": 0.025757559893106758,
"acc_norm": 0.22641509433962265,
"acc_norm_stderr": 0.025757559893106758
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2013888888888889,
"acc_stderr": 0.033536474697138406,
"acc_norm": 0.2013888888888889,
"acc_norm_stderr": 0.033536474697138406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.042207736591714534,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.042207736591714534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2170212765957447,
"acc_stderr": 0.026947483121496228,
"acc_norm": 0.2170212765957447,
"acc_norm_stderr": 0.026947483121496228
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489358,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489358
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.296551724137931,
"acc_stderr": 0.03806142687309993,
"acc_norm": 0.296551724137931,
"acc_norm_stderr": 0.03806142687309993
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2804232804232804,
"acc_stderr": 0.02313528797432563,
"acc_norm": 0.2804232804232804,
"acc_norm_stderr": 0.02313528797432563
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0404061017820884,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0404061017820884
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25161290322580643,
"acc_stderr": 0.024685979286239973,
"acc_norm": 0.25161290322580643,
"acc_norm_stderr": 0.024685979286239973
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.22167487684729065,
"acc_stderr": 0.0292255758924896,
"acc_norm": 0.22167487684729065,
"acc_norm_stderr": 0.0292255758924896
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.30303030303030304,
"acc_stderr": 0.035886248000917075,
"acc_norm": 0.30303030303030304,
"acc_norm_stderr": 0.035886248000917075
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.20707070707070707,
"acc_stderr": 0.02886977846026705,
"acc_norm": 0.20707070707070707,
"acc_norm_stderr": 0.02886977846026705
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.27461139896373055,
"acc_stderr": 0.032210245080411565,
"acc_norm": 0.27461139896373055,
"acc_norm_stderr": 0.032210245080411565
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.23076923076923078,
"acc_stderr": 0.021362027725222724,
"acc_norm": 0.23076923076923078,
"acc_norm_stderr": 0.021362027725222724
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.025928876132766124,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.025928876132766124
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.19327731092436976,
"acc_stderr": 0.025649470265889197,
"acc_norm": 0.19327731092436976,
"acc_norm_stderr": 0.025649470265889197
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23178807947019867,
"acc_stderr": 0.034454062719870546,
"acc_norm": 0.23178807947019867,
"acc_norm_stderr": 0.034454062719870546
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24587155963302754,
"acc_stderr": 0.018461940968708443,
"acc_norm": 0.24587155963302754,
"acc_norm_stderr": 0.018461940968708443
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.028963702570791044,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.028963702570791044
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.3088235294117647,
"acc_stderr": 0.03242661719827218,
"acc_norm": 0.3088235294117647,
"acc_norm_stderr": 0.03242661719827218
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.23628691983122363,
"acc_stderr": 0.027652153144159256,
"acc_norm": 0.23628691983122363,
"acc_norm_stderr": 0.027652153144159256
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.21524663677130046,
"acc_stderr": 0.027584066602208263,
"acc_norm": 0.21524663677130046,
"acc_norm_stderr": 0.027584066602208263
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2231404958677686,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.2231404958677686,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04330043749650742,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04330043749650742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2085889570552147,
"acc_stderr": 0.03192193448934722,
"acc_norm": 0.2085889570552147,
"acc_norm_stderr": 0.03192193448934722
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.04327040932578729,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.04327040932578729
},
"harness|hendrycksTest-management|5": {
"acc": 0.21359223300970873,
"acc_stderr": 0.04058042015646033,
"acc_norm": 0.21359223300970873,
"acc_norm_stderr": 0.04058042015646033
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.24786324786324787,
"acc_stderr": 0.028286324075564393,
"acc_norm": 0.24786324786324787,
"acc_norm_stderr": 0.028286324075564393
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2515964240102171,
"acc_stderr": 0.015517322365529638,
"acc_norm": 0.2515964240102171,
"acc_norm_stderr": 0.015517322365529638
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.0218552552634218,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.0218552552634218
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217892,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217892
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.02463004897982476,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.02463004897982476
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24115755627009647,
"acc_stderr": 0.024296594034763426,
"acc_norm": 0.24115755627009647,
"acc_norm_stderr": 0.024296594034763426
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.25,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24468085106382978,
"acc_stderr": 0.025645553622266733,
"acc_norm": 0.24468085106382978,
"acc_norm_stderr": 0.025645553622266733
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24185136897001303,
"acc_stderr": 0.010936550813827056,
"acc_norm": 0.24185136897001303,
"acc_norm_stderr": 0.010936550813827056
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3014705882352941,
"acc_stderr": 0.027875982114273168,
"acc_norm": 0.3014705882352941,
"acc_norm_stderr": 0.027875982114273168
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2581699346405229,
"acc_stderr": 0.017704531653250075,
"acc_norm": 0.2581699346405229,
"acc_norm_stderr": 0.017704531653250075
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072773,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072773
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.22857142857142856,
"acc_stderr": 0.026882144922307748,
"acc_norm": 0.22857142857142856,
"acc_norm_stderr": 0.026882144922307748
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.30845771144278605,
"acc_stderr": 0.032658195885126966,
"acc_norm": 0.30845771144278605,
"acc_norm_stderr": 0.032658195885126966
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.27710843373493976,
"acc_stderr": 0.034843315926805875,
"acc_norm": 0.27710843373493976,
"acc_norm_stderr": 0.034843315926805875
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.27485380116959063,
"acc_stderr": 0.03424042924691584,
"acc_norm": 0.27485380116959063,
"acc_norm_stderr": 0.03424042924691584
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22888616891064872,
"mc1_stderr": 0.014706994909055027,
"mc2": 0.48811709450476065,
"mc2_stderr": 0.016595901285138773
},
"harness|winogrande|5": {
"acc": 0.5248618784530387,
"acc_stderr": 0.01403510288362775
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
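
To give a sense of how the aggregated metrics above can be consumed, here is a minimal sketch that ranks a few MMLU (hendrycksTest) subtask scores. The inline dict is a hand-copied subset of the JSON above, not a download:

```python
# Rank a hand-copied subset of the MMLU subtask scores shown above.
results = {
    "harness|hendrycksTest-sociology|5": {"acc_norm": 0.30845771144278605},
    "harness|hendrycksTest-us_foreign_policy|5": {"acc_norm": 0.29},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.15555555555555556},
}

# Strip the "harness|hendrycksTest-" prefix and "|5" few-shot suffix
# to get readable task names keyed to their normalized accuracy.
mmlu_scores = {
    task.split("-", 1)[1].rsplit("|", 1)[0]: metrics["acc_norm"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
}

for task, score in sorted(mmlu_scores.items(), key=lambda kv: -kv[1]):
    print(f"{task}: {score:.3f}")
```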
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_NobodyExistsOnTheInternet__clown-SUV-4x70b | [
"region:us"
] | 2024-02-03T13:20:57+00:00 | {"pretty_name": "Evaluation run of NobodyExistsOnTheInternet/clown-SUV-4x70b", "dataset_summary": "Dataset automatically created during the evaluation run of model [NobodyExistsOnTheInternet/clown-SUV-4x70b](https://huggingface.co/NobodyExistsOnTheInternet/clown-SUV-4x70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NobodyExistsOnTheInternet__clown-SUV-4x70b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-03T13:18:34.281350](https://huggingface.co/datasets/open-llm-leaderboard/details_NobodyExistsOnTheInternet__clown-SUV-4x70b/blob/main/results_2024-02-03T13-18-34.281350.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2424611028294509,\n \"acc_stderr\": 0.03031610890226359,\n \"acc_norm\": 0.2428216424826148,\n \"acc_norm_stderr\": 0.031122287295333263,\n \"mc1\": 0.22888616891064872,\n \"mc1_stderr\": 0.014706994909055027,\n \"mc2\": 0.48811709450476065,\n \"mc2_stderr\": 0.016595901285138773\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.20136518771331058,\n \"acc_stderr\": 0.011718927477444265,\n \"acc_norm\": 0.24744027303754265,\n \"acc_norm_stderr\": 0.01261035266329267\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.26777534355706034,\n \"acc_stderr\": 0.004418948941099406,\n \"acc_norm\": 0.28291177056363276,\n \"acc_norm_stderr\": 0.00449493402546234\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322674,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322674\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.15555555555555556,\n \"acc_stderr\": 0.031309483648783144,\n \"acc_norm\": 0.15555555555555556,\n \"acc_norm_stderr\": 0.031309483648783144\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.23026315789473684,\n \"acc_stderr\": 0.03426059424403165,\n \"acc_norm\": 0.23026315789473684,\n \"acc_norm_stderr\": 0.03426059424403165\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.22641509433962265,\n \"acc_stderr\": 0.025757559893106758,\n \"acc_norm\": 0.22641509433962265,\n \"acc_norm_stderr\": 0.025757559893106758\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2013888888888889,\n \"acc_stderr\": 0.033536474697138406,\n \"acc_norm\": 0.2013888888888889,\n 
\"acc_norm_stderr\": 0.033536474697138406\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24277456647398843,\n \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.24277456647398843,\n \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.042207736591714534,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.042207736591714534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2170212765957447,\n \"acc_stderr\": 0.026947483121496228,\n \"acc_norm\": 0.2170212765957447,\n \"acc_norm_stderr\": 0.026947483121496228\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.04142439719489358,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.04142439719489358\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.296551724137931,\n \"acc_stderr\": 0.03806142687309993,\n \"acc_norm\": 0.296551724137931,\n \"acc_norm_stderr\": 0.03806142687309993\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2804232804232804,\n \"acc_stderr\": 0.02313528797432563,\n \"acc_norm\": 0.2804232804232804,\n \"acc_norm_stderr\": 0.02313528797432563\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.0404061017820884,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.0404061017820884\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036625,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036625\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25161290322580643,\n \"acc_stderr\": 0.024685979286239973,\n \"acc_norm\": 0.25161290322580643,\n \"acc_norm_stderr\": 0.024685979286239973\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.22167487684729065,\n \"acc_stderr\": 0.0292255758924896,\n \"acc_norm\": 0.22167487684729065,\n \"acc_norm_stderr\": 0.0292255758924896\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.30303030303030304,\n \"acc_stderr\": 0.035886248000917075,\n \"acc_norm\": 0.30303030303030304,\n \"acc_norm_stderr\": 0.035886248000917075\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.20707070707070707,\n \"acc_stderr\": 0.02886977846026705,\n \"acc_norm\": 0.20707070707070707,\n \"acc_norm_stderr\": 0.02886977846026705\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.27461139896373055,\n \"acc_stderr\": 0.032210245080411565,\n 
\"acc_norm\": 0.27461139896373055,\n \"acc_norm_stderr\": 0.032210245080411565\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.23076923076923078,\n \"acc_stderr\": 0.021362027725222724,\n \"acc_norm\": 0.23076923076923078,\n \"acc_norm_stderr\": 0.021362027725222724\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.23703703703703705,\n \"acc_stderr\": 0.025928876132766124,\n \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.025928876132766124\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.19327731092436976,\n \"acc_stderr\": 0.025649470265889197,\n \"acc_norm\": 0.19327731092436976,\n \"acc_norm_stderr\": 0.025649470265889197\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.23178807947019867,\n \"acc_stderr\": 0.034454062719870546,\n \"acc_norm\": 0.23178807947019867,\n \"acc_norm_stderr\": 0.034454062719870546\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.24587155963302754,\n \"acc_stderr\": 0.018461940968708443,\n \"acc_norm\": 0.24587155963302754,\n \"acc_norm_stderr\": 0.018461940968708443\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2361111111111111,\n \"acc_stderr\": 0.028963702570791044,\n \"acc_norm\": 0.2361111111111111,\n \"acc_norm_stderr\": 0.028963702570791044\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.3088235294117647,\n \"acc_stderr\": 0.03242661719827218,\n \"acc_norm\": 0.3088235294117647,\n \"acc_norm_stderr\": 0.03242661719827218\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.23628691983122363,\n \"acc_stderr\": 0.027652153144159256,\n \"acc_norm\": 0.23628691983122363,\n \"acc_norm_stderr\": 0.027652153144159256\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.21524663677130046,\n \"acc_stderr\": 0.027584066602208263,\n \"acc_norm\": 0.21524663677130046,\n \"acc_norm_stderr\": 0.027584066602208263\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306086,\n \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306086\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2231404958677686,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.2231404958677686,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.04330043749650742,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.04330043749650742\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2085889570552147,\n \"acc_stderr\": 0.03192193448934722,\n \"acc_norm\": 0.2085889570552147,\n \"acc_norm_stderr\": 0.03192193448934722\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n \"acc_stderr\": 0.04327040932578729,\n \"acc_norm\": 0.29464285714285715,\n \"acc_norm_stderr\": 0.04327040932578729\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.21359223300970873,\n \"acc_stderr\": 0.04058042015646033,\n \"acc_norm\": 0.21359223300970873,\n \"acc_norm_stderr\": 0.04058042015646033\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.24786324786324787,\n \"acc_stderr\": 0.028286324075564393,\n \"acc_norm\": 0.24786324786324787,\n \"acc_norm_stderr\": 0.028286324075564393\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n 
\"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2515964240102171,\n \"acc_stderr\": 0.015517322365529638,\n \"acc_norm\": 0.2515964240102171,\n \"acc_norm_stderr\": 0.015517322365529638\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.0218552552634218,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.0218552552634218\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n \"acc_stderr\": 0.014333522059217892,\n \"acc_norm\": 0.2424581005586592,\n \"acc_norm_stderr\": 0.014333522059217892\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.02463004897982476,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.02463004897982476\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24115755627009647,\n \"acc_stderr\": 0.024296594034763426,\n \"acc_norm\": 0.24115755627009647,\n \"acc_norm_stderr\": 0.024296594034763426\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.24468085106382978,\n \"acc_stderr\": 0.025645553622266733,\n \"acc_norm\": 0.24468085106382978,\n \"acc_norm_stderr\": 0.025645553622266733\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24185136897001303,\n \"acc_stderr\": 0.010936550813827056,\n \"acc_norm\": 0.24185136897001303,\n \"acc_norm_stderr\": 0.010936550813827056\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.3014705882352941,\n \"acc_stderr\": 0.027875982114273168,\n \"acc_norm\": 0.3014705882352941,\n \"acc_norm_stderr\": 0.027875982114273168\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2581699346405229,\n \"acc_stderr\": 0.017704531653250075,\n \"acc_norm\": 0.2581699346405229,\n \"acc_norm_stderr\": 0.017704531653250075\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n \"acc_stderr\": 0.04013964554072773,\n \"acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.04013964554072773\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.22857142857142856,\n \"acc_stderr\": 0.026882144922307748,\n \"acc_norm\": 0.22857142857142856,\n \"acc_norm_stderr\": 0.026882144922307748\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.30845771144278605,\n \"acc_stderr\": 0.032658195885126966,\n \"acc_norm\": 0.30845771144278605,\n \"acc_norm_stderr\": 0.032658195885126966\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.27710843373493976,\n \"acc_stderr\": 0.034843315926805875,\n \"acc_norm\": 0.27710843373493976,\n \"acc_norm_stderr\": 0.034843315926805875\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.27485380116959063,\n \"acc_stderr\": 0.03424042924691584,\n \"acc_norm\": 0.27485380116959063,\n \"acc_norm_stderr\": 0.03424042924691584\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22888616891064872,\n \"mc1_stderr\": 0.014706994909055027,\n \"mc2\": 0.48811709450476065,\n \"mc2_stderr\": 0.016595901285138773\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5248618784530387,\n \"acc_stderr\": 0.01403510288362775\n 
},\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/NobodyExistsOnTheInternet/clown-SUV-4x70b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|arc:challenge|25_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|gsm8k|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hellaswag|10_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T13-18-34.281350.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T13-18-34.281350.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T13-18-34.281350.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T13-18-34.281350.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T13-18-34.281350.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["**/details_harness|winogrande|5_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-02-03T13-18-34.281350.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_02_03T13_18_34.281350", "path": ["results_2024-02-03T13-18-34.281350.parquet"]}, {"split": "latest", "path": ["results_2024-02-03T13-18-34.281350.parquet"]}]}]} | 2024-02-03T13:21:19+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of NobodyExistsOnTheInternet/clown-SUV-4x70b
Dataset automatically created during the evaluation run of model NobodyExistsOnTheInternet/clown-SUV-4x70b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
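A minimal sketch, assuming the leaderboard's usual `details_<org>__<model>` dataset naming for this run (the `harness_winogrande_5` configuration is listed in this record's metadata):

```python
from datasets import load_dataset

# Per-sample details for the 5-shot Winogrande task;
# the "train" split points to the latest results.
data = load_dataset("open-llm-leaderboard/details_NobodyExistsOnTheInternet__clown-SUV-4x70b",
	"harness_winogrande_5",
	split="train")
```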
## Latest results
These are the latest results from run 2024-02-03T13:18:34.281350 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
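A short excerpt of the scores, reproduced verbatim from this run's metadata (the full per-task breakdown follows the same structure as the other cards in this collection):

```python
{
    "harness|truthfulqa:mc|0": {
        "mc1": 0.22888616891064872,
        "mc1_stderr": 0.014706994909055027,
        "mc2": 0.48811709450476065,
        "mc2_stderr": 0.016595901285138773
    },
    "harness|winogrande|5": {
        "acc": 0.5248618784530387,
        "acc_stderr": 0.01403510288362775
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    }
}
```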
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of NobodyExistsOnTheInternet/clown-SUV-4x70b\n\n\n\nDataset automatically created during the evaluation run of model NobodyExistsOnTheInternet/clown-SUV-4x70b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T13:18:34.281350(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of NobodyExistsOnTheInternet/clown-SUV-4x70b\n\n\n\nDataset automatically created during the evaluation run of model NobodyExistsOnTheInternet/clown-SUV-4x70b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T13:18:34.281350(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
598d87096e96fe3434cb6b7ee7803454b93d2d06 | # Dataset Card for "cauhoiphapluat"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | phamtungthuy/cauhoiphapluat | [
"region:us"
] | 2024-02-03T13:22:19+00:00 | {"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "answer", "dtype": "string"}, {"name": "field", "dtype": "string"}, {"name": "time", "dtype": "string"}, {"name": "relevant", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 56126262, "num_examples": 21820}, {"name": "train", "num_bytes": 1051367840, "num_examples": 414852}], "download_size": 403446647, "dataset_size": 1107494102}, "configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}, {"split": "train", "path": "data/train-*"}]}]} | 2024-02-03T13:24:05+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "cauhoiphapluat"
More Information needed | [
"# Dataset Card for \"cauhoiphapluat\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"cauhoiphapluat\"\n\nMore Information needed"
] |
55b5665038f93ac5027ca5d6a9749954b13e40ea |
# Dataset Card for Evaluation run of NeverSleep/MiquMaid-v1-70B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [NeverSleep/MiquMaid-v1-70B](https://huggingface.co/NeverSleep/MiquMaid-v1-70B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NeverSleep__MiquMaid-v1-70B",
"harness_winogrande_5",
split="train")
```
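
The aggregated metrics described above can be loaded the same way; a minimal sketch, assuming the "results" configuration and "latest" split naming used by the other runs in this collection:

```python
from datasets import load_dataset

# Aggregated run-level metrics; the "latest" split mirrors
# the most recent results_*.parquet file.
results = load_dataset("open-llm-leaderboard/details_NeverSleep__MiquMaid-v1-70B",
	"results",
	split="latest")
```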
## Latest results
These are the [latest results from run 2024-02-03T13:55:45.160201](https://huggingface.co/datasets/open-llm-leaderboard/details_NeverSleep__MiquMaid-v1-70B/blob/main/results_2024-02-03T13-55-45.160201.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.747644956445168,
"acc_stderr": 0.028643244664619514,
"acc_norm": 0.7506821937250182,
"acc_norm_stderr": 0.029197302210032687,
"mc1": 0.45165238678090575,
"mc1_stderr": 0.017421480300277643,
"mc2": 0.6178734796953549,
"mc2_stderr": 0.014878715039713398
},
"harness|arc:challenge|25": {
"acc": 0.674061433447099,
"acc_stderr": 0.013697432466693244,
"acc_norm": 0.7167235494880546,
"acc_norm_stderr": 0.013167478735134576
},
"harness|hellaswag|10": {
"acc": 0.694582752439753,
"acc_stderr": 0.004596426220000888,
"acc_norm": 0.8796056562437762,
"acc_norm_stderr": 0.003247570330456916
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6962962962962963,
"acc_stderr": 0.03972552884785137,
"acc_norm": 0.6962962962962963,
"acc_norm_stderr": 0.03972552884785137
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8486842105263158,
"acc_stderr": 0.029162631596843982,
"acc_norm": 0.8486842105263158,
"acc_norm_stderr": 0.029162631596843982
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7849056603773585,
"acc_stderr": 0.025288394502891366,
"acc_norm": 0.7849056603773585,
"acc_norm_stderr": 0.025288394502891366
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8958333333333334,
"acc_stderr": 0.02554523921025691,
"acc_norm": 0.8958333333333334,
"acc_norm_stderr": 0.02554523921025691
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.033450369167889904,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.033450369167889904
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.049512182523962625,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.049512182523962625
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7531914893617021,
"acc_stderr": 0.02818544130123409,
"acc_norm": 0.7531914893617021,
"acc_norm_stderr": 0.02818544130123409
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5964912280701754,
"acc_stderr": 0.04615186962583707,
"acc_norm": 0.5964912280701754,
"acc_norm_stderr": 0.04615186962583707
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7241379310344828,
"acc_stderr": 0.03724563619774632,
"acc_norm": 0.7241379310344828,
"acc_norm_stderr": 0.03724563619774632
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5396825396825397,
"acc_stderr": 0.025670080636909308,
"acc_norm": 0.5396825396825397,
"acc_norm_stderr": 0.025670080636909308
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5634920634920635,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.5634920634920635,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.867741935483871,
"acc_stderr": 0.019272015434846475,
"acc_norm": 0.867741935483871,
"acc_norm_stderr": 0.019272015434846475
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6305418719211823,
"acc_stderr": 0.033959703819985754,
"acc_norm": 0.6305418719211823,
"acc_norm_stderr": 0.033959703819985754
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8424242424242424,
"acc_stderr": 0.028450388805284332,
"acc_norm": 0.8424242424242424,
"acc_norm_stderr": 0.028450388805284332
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9141414141414141,
"acc_stderr": 0.019960225563172885,
"acc_norm": 0.9141414141414141,
"acc_norm_stderr": 0.019960225563172885
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9430051813471503,
"acc_stderr": 0.01673108529360756,
"acc_norm": 0.9430051813471503,
"acc_norm_stderr": 0.01673108529360756
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7974358974358975,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.7974358974358975,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.029869605095316904,
"acc_norm": 0.4,
"acc_norm_stderr": 0.029869605095316904
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8739495798319328,
"acc_stderr": 0.02155962312121392,
"acc_norm": 0.8739495798319328,
"acc_norm_stderr": 0.02155962312121392
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4900662251655629,
"acc_stderr": 0.04081677107248437,
"acc_norm": 0.4900662251655629,
"acc_norm_stderr": 0.04081677107248437
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9174311926605505,
"acc_stderr": 0.011800361363016567,
"acc_norm": 0.9174311926605505,
"acc_norm_stderr": 0.011800361363016567
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6898148148148148,
"acc_stderr": 0.031546962856566295,
"acc_norm": 0.6898148148148148,
"acc_norm_stderr": 0.031546962856566295
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9215686274509803,
"acc_stderr": 0.018869514646658928,
"acc_norm": 0.9215686274509803,
"acc_norm_stderr": 0.018869514646658928
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9071729957805907,
"acc_stderr": 0.01888975055095671,
"acc_norm": 0.9071729957805907,
"acc_norm_stderr": 0.01888975055095671
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8116591928251121,
"acc_stderr": 0.026241132996407256,
"acc_norm": 0.8116591928251121,
"acc_norm_stderr": 0.026241132996407256
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8549618320610687,
"acc_stderr": 0.030884661089515375,
"acc_norm": 0.8549618320610687,
"acc_norm_stderr": 0.030884661089515375
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9008264462809917,
"acc_stderr": 0.02728524631275895,
"acc_norm": 0.9008264462809917,
"acc_norm_stderr": 0.02728524631275895
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.9074074074074074,
"acc_stderr": 0.02802188803860944,
"acc_norm": 0.9074074074074074,
"acc_norm_stderr": 0.02802188803860944
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.030446777687971723,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.030446777687971723
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.8737864077669902,
"acc_stderr": 0.03288180278808629,
"acc_norm": 0.8737864077669902,
"acc_norm_stderr": 0.03288180278808629
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9316239316239316,
"acc_stderr": 0.01653462768431136,
"acc_norm": 0.9316239316239316,
"acc_norm_stderr": 0.01653462768431136
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.89272030651341,
"acc_stderr": 0.01106657144950843,
"acc_norm": 0.89272030651341,
"acc_norm_stderr": 0.01106657144950843
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8208092485549133,
"acc_stderr": 0.020647590029679332,
"acc_norm": 0.8208092485549133,
"acc_norm_stderr": 0.020647590029679332
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6290502793296089,
"acc_stderr": 0.016155910721341777,
"acc_norm": 0.6290502793296089,
"acc_norm_stderr": 0.016155910721341777
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.021828596053108416,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.021828596053108416
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8263665594855305,
"acc_stderr": 0.02151405158597041,
"acc_norm": 0.8263665594855305,
"acc_norm_stderr": 0.02151405158597041
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8487654320987654,
"acc_stderr": 0.019935086092149893,
"acc_norm": 0.8487654320987654,
"acc_norm_stderr": 0.019935086092149893
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5673758865248227,
"acc_stderr": 0.02955545423677884,
"acc_norm": 0.5673758865248227,
"acc_norm_stderr": 0.02955545423677884
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5814863102998696,
"acc_stderr": 0.012599505608336482,
"acc_norm": 0.5814863102998696,
"acc_norm_stderr": 0.012599505608336482
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.023886881922440335,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.023886881922440335
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.015750526284363346,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.015750526284363346
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8244897959183674,
"acc_stderr": 0.024352800722970015,
"acc_norm": 0.8244897959183674,
"acc_norm_stderr": 0.024352800722970015
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9253731343283582,
"acc_stderr": 0.018581939698490618,
"acc_norm": 0.9253731343283582,
"acc_norm_stderr": 0.018581939698490618
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.94,
"acc_stderr": 0.02386832565759418,
"acc_norm": 0.94,
"acc_norm_stderr": 0.02386832565759418
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685515,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015577,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015577
},
"harness|truthfulqa:mc|0": {
"mc1": 0.45165238678090575,
"mc1_stderr": 0.017421480300277643,
"mc2": 0.6178734796953549,
"mc2_stderr": 0.014878715039713398
},
"harness|winogrande|5": {
"acc": 0.850828729281768,
"acc_stderr": 0.010012598805627328
},
"harness|gsm8k|5": {
"acc": 0.6929492039423806,
"acc_stderr": 0.012705685723131719
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_NeverSleep__MiquMaid-v1-70B | [
"region:us"
] | 2024-02-03T13:58:09+00:00 | {"pretty_name": "Evaluation run of NeverSleep/MiquMaid-v1-70B", "dataset_summary": "Dataset automatically created during the evaluation run of model [NeverSleep/MiquMaid-v1-70B](https://huggingface.co/NeverSleep/MiquMaid-v1-70B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NeverSleep__MiquMaid-v1-70B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-02-03T13:55:45.160201](https://huggingface.co/datasets/open-llm-leaderboard/details_NeverSleep__MiquMaid-v1-70B/blob/main/results_2024-02-03T13-55-45.160201.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.747644956445168,\n \"acc_stderr\": 0.028643244664619514,\n \"acc_norm\": 0.7506821937250182,\n \"acc_norm_stderr\": 0.029197302210032687,\n \"mc1\": 0.45165238678090575,\n \"mc1_stderr\": 0.017421480300277643,\n \"mc2\": 0.6178734796953549,\n \"mc2_stderr\": 0.014878715039713398\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.674061433447099,\n \"acc_stderr\": 0.013697432466693244,\n \"acc_norm\": 0.7167235494880546,\n \"acc_norm_stderr\": 0.013167478735134576\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.694582752439753,\n \"acc_stderr\": 0.004596426220000888,\n \"acc_norm\": 0.8796056562437762,\n \"acc_norm_stderr\": 0.003247570330456916\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6962962962962963,\n \"acc_stderr\": 0.03972552884785137,\n \"acc_norm\": 0.6962962962962963,\n \"acc_norm_stderr\": 0.03972552884785137\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8486842105263158,\n \"acc_stderr\": 0.029162631596843982,\n \"acc_norm\": 0.8486842105263158,\n \"acc_norm_stderr\": 0.029162631596843982\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7849056603773585,\n \"acc_stderr\": 0.025288394502891366,\n \"acc_norm\": 0.7849056603773585,\n \"acc_norm_stderr\": 0.025288394502891366\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8958333333333334,\n \"acc_stderr\": 0.02554523921025691,\n \"acc_norm\": 0.8958333333333334,\n \"acc_norm_stderr\": 0.02554523921025691\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.033450369167889904,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.033450369167889904\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.049512182523962625,\n \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.049512182523962625\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7531914893617021,\n \"acc_stderr\": 0.02818544130123409,\n \"acc_norm\": 0.7531914893617021,\n \"acc_norm_stderr\": 0.02818544130123409\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5964912280701754,\n \"acc_stderr\": 0.04615186962583707,\n \"acc_norm\": 0.5964912280701754,\n \"acc_norm_stderr\": 0.04615186962583707\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7241379310344828,\n \"acc_stderr\": 0.03724563619774632,\n \"acc_norm\": 0.7241379310344828,\n \"acc_norm_stderr\": 0.03724563619774632\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.5396825396825397,\n \"acc_stderr\": 0.025670080636909308,\n \"acc_norm\": 0.5396825396825397,\n \"acc_norm_stderr\": 0.025670080636909308\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5634920634920635,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.5634920634920635,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.867741935483871,\n \"acc_stderr\": 0.019272015434846475,\n \"acc_norm\": 0.867741935483871,\n \"acc_norm_stderr\": 0.019272015434846475\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6305418719211823,\n \"acc_stderr\": 0.033959703819985754,\n \"acc_norm\": 0.6305418719211823,\n \"acc_norm_stderr\": 0.033959703819985754\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8424242424242424,\n \"acc_stderr\": 0.028450388805284332,\n \"acc_norm\": 0.8424242424242424,\n \"acc_norm_stderr\": 0.028450388805284332\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9141414141414141,\n \"acc_stderr\": 0.019960225563172885,\n \"acc_norm\": 0.9141414141414141,\n \"acc_norm_stderr\": 0.019960225563172885\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.01673108529360756,\n \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.01673108529360756\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7974358974358975,\n \"acc_stderr\": 0.020377660970371372,\n \"acc_norm\": 0.7974358974358975,\n \"acc_norm_stderr\": 0.020377660970371372\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.029869605095316904,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.029869605095316904\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8739495798319328,\n \"acc_stderr\": 0.02155962312121392,\n \"acc_norm\": 0.8739495798319328,\n \"acc_norm_stderr\": 0.02155962312121392\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4900662251655629,\n \"acc_stderr\": 0.04081677107248437,\n \"acc_norm\": 0.4900662251655629,\n \"acc_norm_stderr\": 0.04081677107248437\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9174311926605505,\n \"acc_stderr\": 0.011800361363016567,\n \"acc_norm\": 0.9174311926605505,\n \"acc_norm_stderr\": 0.011800361363016567\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6898148148148148,\n \"acc_stderr\": 0.031546962856566295,\n \"acc_norm\": 0.6898148148148148,\n \"acc_norm_stderr\": 0.031546962856566295\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9215686274509803,\n \"acc_stderr\": 0.018869514646658928,\n \"acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.018869514646658928\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9071729957805907,\n \"acc_stderr\": 0.01888975055095671,\n \"acc_norm\": 0.9071729957805907,\n \"acc_norm_stderr\": 0.01888975055095671\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8116591928251121,\n \"acc_stderr\": 0.026241132996407256,\n \"acc_norm\": 0.8116591928251121,\n \"acc_norm_stderr\": 0.026241132996407256\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8549618320610687,\n \"acc_stderr\": 0.030884661089515375,\n \"acc_norm\": 0.8549618320610687,\n \"acc_norm_stderr\": 0.030884661089515375\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9008264462809917,\n \"acc_stderr\": 0.02728524631275895,\n \"acc_norm\": 0.9008264462809917,\n \"acc_norm_stderr\": 0.02728524631275895\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.9074074074074074,\n \"acc_stderr\": 0.02802188803860944,\n \"acc_norm\": 0.9074074074074074,\n \"acc_norm_stderr\": 0.02802188803860944\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.030446777687971723,\n \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.030446777687971723\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8737864077669902,\n \"acc_stderr\": 0.03288180278808629,\n \"acc_norm\": 0.8737864077669902,\n \"acc_norm_stderr\": 0.03288180278808629\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9316239316239316,\n \"acc_stderr\": 0.01653462768431136,\n \"acc_norm\": 0.9316239316239316,\n \"acc_norm_stderr\": 0.01653462768431136\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.89272030651341,\n 
\"acc_stderr\": 0.01106657144950843,\n \"acc_norm\": 0.89272030651341,\n \"acc_norm_stderr\": 0.01106657144950843\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8208092485549133,\n \"acc_stderr\": 0.020647590029679332,\n \"acc_norm\": 0.8208092485549133,\n \"acc_norm_stderr\": 0.020647590029679332\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6290502793296089,\n \"acc_stderr\": 0.016155910721341777,\n \"acc_norm\": 0.6290502793296089,\n \"acc_norm_stderr\": 0.016155910721341777\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.021828596053108416,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.021828596053108416\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8263665594855305,\n \"acc_stderr\": 0.02151405158597041,\n \"acc_norm\": 0.8263665594855305,\n \"acc_norm_stderr\": 0.02151405158597041\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8487654320987654,\n \"acc_stderr\": 0.019935086092149893,\n \"acc_norm\": 0.8487654320987654,\n \"acc_norm_stderr\": 0.019935086092149893\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5673758865248227,\n \"acc_stderr\": 0.02955545423677884,\n \"acc_norm\": 0.5673758865248227,\n \"acc_norm_stderr\": 0.02955545423677884\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5814863102998696,\n \"acc_stderr\": 0.012599505608336482,\n \"acc_norm\": 0.5814863102998696,\n \"acc_norm_stderr\": 0.012599505608336482\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8088235294117647,\n \"acc_stderr\": 0.023886881922440335,\n \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.023886881922440335\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8137254901960784,\n \"acc_stderr\": 0.015750526284363346,\n \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.015750526284363346\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8244897959183674,\n \"acc_stderr\": 0.024352800722970015,\n \"acc_norm\": 0.8244897959183674,\n \"acc_norm_stderr\": 0.024352800722970015\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9253731343283582,\n \"acc_stderr\": 0.018581939698490618,\n \"acc_norm\": 0.9253731343283582,\n \"acc_norm_stderr\": 0.018581939698490618\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.94,\n \"acc_stderr\": 0.02386832565759418,\n \"acc_norm\": 0.94,\n \"acc_norm_stderr\": 0.02386832565759418\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685515,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685515\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015577,\n \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015577\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.45165238678090575,\n \"mc1_stderr\": 0.017421480300277643,\n \"mc2\": 0.6178734796953549,\n \"mc2_stderr\": 0.014878715039713398\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.850828729281768,\n \"acc_stderr\": 0.010012598805627328\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6929492039423806,\n \"acc_stderr\": 0.012705685723131719\n }\n}\n```", "repo_url": 
"https://huggingface.co/NeverSleep/MiquMaid-v1-70B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|arc:challenge|25_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|gsm8k|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hellaswag|10_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T13-55-45.160201.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T13-55-45.160201.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-management|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-02-03T13-55-45.160201.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-virology|5_2024-02-03T13-55-45.160201.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T13-55-45.160201.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_02_03T13_55_45.160201", "path": ["**/details_harness|winogrande|5_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-02-03T13-55-45.160201.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_02_03T13_55_45.160201", "path": ["results_2024-02-03T13-55-45.160201.parquet"]}, {"split": "latest", "path": ["results_2024-02-03T13-55-45.160201.parquet"]}]}]} | 2024-02-03T13:58:30+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of NeverSleep/MiquMaid-v1-70B
Dataset automatically created during the evaluation run of model NeverSleep/MiquMaid-v1-70B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
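```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_NeverSleep__MiquMaid-v1-70B",
    "harness_winogrande_5",
    split="train",
)
```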
## Latest results
These are the latest results from run 2024-02-03T13:55:45.160201 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of NeverSleep/MiquMaid-v1-70B\n\n\n\nDataset automatically created during the evaluation run of model NeverSleep/MiquMaid-v1-70B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T13:55:45.160201(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of NeverSleep/MiquMaid-v1-70B\n\n\n\nDataset automatically created during the evaluation run of model NeverSleep/MiquMaid-v1-70B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-02-03T13:55:45.160201(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
691230c0699ff8a9a37d45e607051cb3f9408568 |
[THUDM/webglm-qa](https://huggingface.co/datasets/THUDM/webglm-qa) in ChatML format.
Python code used for conversion:
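The script joins the reference passages into a bulleted context block, strips single-digit citation markers such as `[1]` from the answers, and randomly swaps the order of the question and the context so the resulting prompts vary in layout before being serialized with the tokenizer's chat template.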
```python
import random
import re

import pandas
from datasets import load_dataset
from transformers import AutoTokenizer

# Tokenizer whose chat template defines the ChatML serialization.
tokenizer = AutoTokenizer.from_pretrained(
    pretrained_model_name_or_path="Felladrin/Llama-160M-Chat-v1"
)

dataset = load_dataset("THUDM/webglm-qa", split="train")


def format_row(columns):
    # Join the reference passages into a bulleted context block.
    references = "\n".join(
        f"- {reference.strip()}" for reference in columns["references"]
    )
    question = columns["question"].strip()
    answer = columns["answer"].strip()

    # Strip single-digit citation markers such as "[1]" from the answer.
    assistant_message = re.sub(r"\[\d\]", "", answer)

    # Randomize whether the question or the context comes first,
    # so the prompts don't all share a single layout.
    if random.random() < 0.5:
        user_message = f"Question:\n{question}\n\nContext:\n{references}"
    else:
        user_message = f"Context:\n{references}\n\nQuestion:\n{question}"

    messages = [
        {"role": "user", "content": user_message},
        {"role": "assistant", "content": assistant_message},
    ]

    # Serialize the exchange with the tokenizer's ChatML chat template.
    return tokenizer.apply_chat_template(messages, tokenize=False)


pandas.DataFrame({"text": [format_row(columns) for columns in dataset]}).to_parquet(
    "train.parquet", index=False
)
``` | Felladrin/ChatML-WebGLM-QA | [
"task_categories:question-answering",
"task_categories:text-generation",
"size_categories:10K<n<100K",
"language:en",
"license:apache-2.0",
"region:us"
] | 2024-02-03T14:07:09+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["10K<n<100K"], "task_categories": ["question-answering", "text-generation"]} | 2024-02-03T14:34:32+00:00 | [] | [
"en"
] | TAGS
#task_categories-question-answering #task_categories-text-generation #size_categories-10K<n<100K #language-English #license-apache-2.0 #region-us
|
THUDM/webglm-qa in ChatML format.
Python code used for conversion:
| [] | [
"TAGS\n#task_categories-question-answering #task_categories-text-generation #size_categories-10K<n<100K #language-English #license-apache-2.0 #region-us \n"
] |
8aac5a413b4db083ea7607d92a0020a55281d353 | ## Setup
- Prepare a virtual environment
```
$ python3 -m venv venv
```
- Activate the virtual environment
```
$ source venv/bin/activate
```
- Install libraries
```
$ pip install -v -e .
```
| kazuyaseki/me | [
"region:us"
] | 2024-02-03T14:08:06+00:00 | {} | 2024-02-10T09:51:14+00:00 | [] | [] | TAGS
#region-us
| ## Setup
- Prepare virtual environments
- Activate virtual environments
- Install libraries
| [
"## Setup\n\n- Prepare virtual environments\n\n\n\n- Activate virtual environments\n\n\n\n- Install libraries"
] | [
"TAGS\n#region-us \n",
"## Setup\n\n- Prepare virtual environments\n\n\n\n- Activate virtual environments\n\n\n\n- Install libraries"
] |
c8e567c55d72591b7421b661005066ed647c01a1 |
The math_QAaugP dataset is a combination of [MetaMathQA](https://huggingface.co/datasets/meta-math/MetaMathQA), [MathInstruct](https://huggingface.co/datasets/TIGER-Lab/MathInstruct), and some internal data.
We use the [Arithmo](https://huggingface.co/datasets/akjindal53244/Arithmo-Data) dataset for the combination of MetaMathQA and MathInstruct.
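To experiment with the data, it can be loaded through the standard `datasets` API; a minimal sketch, assuming the default `train` split:

```python
from datasets import load_dataset

# Load the combined math QA dataset (split name assumed to be "train").
dataset = load_dataset("adityasihag/math_QAaugP", split="train")
print(dataset[0])
```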
| adityasihag/math_QAaugP | [
"license:mit",
"region:us"
] | 2024-02-03T14:20:17+00:00 | {"license": "mit"} | 2024-02-11T14:38:42+00:00 | [] | [] | TAGS
#license-mit #region-us
|
math_QAaugP dataset is a combination of MetaMathQA, MathInstruct, and some internal data.
We use Arithmo dataset for the combination of MetaMathQA and MathInstruct.
| [] | [
"TAGS\n#license-mit #region-us \n"
] |