# Dataset Card for Evaluation run of ewqr2130/mistral-moe-scratch
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ewqr2130/mistral-moe-scratch](https://huggingface.co/ewqr2130/mistral-moe-scratch) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
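Before loading anything, you can enumerate the available configurations. This is a minimal sketch assuming the standard `datasets` API (`get_dataset_config_names` is not shown in the original card):
```python
from datasets import get_dataset_config_names

# List the 63 per-task configurations plus the aggregated "results" configuration
configs = get_dataset_config_names("open-llm-leaderboard/details_ewqr2130__mistral-moe-scratch")
print(len(configs), configs[:5])
```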
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ewqr2130__mistral-moe-scratch",
"harness_winogrande_5",
split="train")
```
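The aggregated metrics can be loaded the same way by pointing at the "results" configuration. A minimal sketch, assuming the "latest" split name used by this configuration in the repository layout:
```python
from datasets import load_dataset

# Aggregated metrics for the run; "latest" mirrors the most recent evaluation
results = load_dataset("open-llm-leaderboard/details_ewqr2130__mistral-moe-scratch",
	"results",
	split="latest")
print(results[0])
```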
## Latest results
These are the [latest results from run 2024-01-05T05:44:38.553210](https://huggingface.co/datasets/open-llm-leaderboard/details_ewqr2130__mistral-moe-scratch/blob/main/results_2024-01-05T05-44-38.553210.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23196194129343728,
"acc_stderr": 0.029934654752561563,
"acc_norm": 0.2314240573187148,
"acc_norm_stderr": 0.03071122006512167,
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|arc:challenge|25": {
"acc": 0.22696245733788395,
"acc_stderr": 0.012240491536132861,
"acc_norm": 0.22696245733788395,
"acc_norm_stderr": 0.012240491536132861
},
"harness|hellaswag|10": {
"acc": 0.2504481179047998,
"acc_stderr": 0.004323856300539177,
"acc_norm": 0.2504481179047998,
"acc_norm_stderr": 0.004323856300539177
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|winogrande|5": {
"acc": 0.4956590370955012,
"acc_stderr": 0.014051956064076911
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
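If you prefer to work with the raw JSON file referenced above rather than the dataset splits, it can be fetched directly from the repository. A hedged sketch using the `huggingface_hub` client (not part of the original card):
```python
import json
from huggingface_hub import hf_hub_download

# Download the results file referenced in the "Latest results" section
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_ewqr2130__mistral-moe-scratch",
    filename="results_2024-01-05T05-44-38.553210.json",
    repo_type="dataset",
)

with open(path) as f:
    results = json.load(f)

# Inspect the top-level keys (the file may wrap the per-task dict shown above)
print(list(results.keys()))
```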
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
#region-us
|
# Dataset Card for Evaluation run of ewqr2130/mistral-moe-scratch
Dataset automatically created during the evaluation run of model ewqr2130/mistral-moe-scratch on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-05T05:44:38.553210 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
b8222d3152eeb921a993195f6e26de79be0093f3 |
# Dataset Card for Evaluation run of UCLA-AGI/zephyr-7b-sft-full-SPIN-iter1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [UCLA-AGI/zephyr-7b-sft-full-SPIN-iter1](https://huggingface.co/UCLA-AGI/zephyr-7b-sft-full-SPIN-iter1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_UCLA-AGI__zephyr-7b-sft-full-SPIN-iter1",
"harness_winogrande_5",
split="train")
```
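Each run is also exposed under the timestamp-named splits listed in this card's metadata, alongside a `latest` alias, so you can pin an analysis to a specific run or pull the aggregated scores directly. A minimal sketch (the `results` configuration and the split names below are the ones described in this card; adjust them if the repo layout changes):

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_UCLA-AGI__zephyr-7b-sft-full-SPIN-iter1"

# Aggregated metrics for the most recent run (the "results" configuration
# described above, via its "latest" split alias).
results = load_dataset(REPO, "results", split="latest")

# Per-sample details for one MMLU sub-task, pinned to a specific run timestamp
# taken from the config list in this card's metadata.
abstract_algebra = load_dataset(
    REPO,
    "harness_hendrycksTest_abstract_algebra_5",
    split="2024_01_05T06_31_48.797611",
)
```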
## Latest results
These are the [latest results from run 2024-01-05T06:31:48.797611](https://huggingface.co/datasets/open-llm-leaderboard/details_UCLA-AGI__zephyr-7b-sft-full-SPIN-iter1/blob/main/results_2024-01-05T06-31-48.797611.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6083070915056247,
"acc_stderr": 0.032946994490981554,
"acc_norm": 0.6144414847119565,
"acc_norm_stderr": 0.03362118464961407,
"mc1": 0.408812729498164,
"mc1_stderr": 0.01720995215164173,
"mc2": 0.5738893811625738,
"mc2_stderr": 0.015990080392547533
},
"harness|arc:challenge|25": {
"acc": 0.6168941979522184,
"acc_stderr": 0.014206472661672876,
"acc_norm": 0.658703071672355,
"acc_norm_stderr": 0.013855831287497728
},
"harness|hellaswag|10": {
"acc": 0.6759609639514041,
"acc_stderr": 0.004670581884781161,
"acc_norm": 0.8544114718183629,
"acc_norm_stderr": 0.003519724163310889
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.03910525752849724,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.03910525752849724
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880263,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880263
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.037336266553835096,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.037336266553835096
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.025225450284067884,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.025225450284067884
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949098,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949098
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7290322580645161,
"acc_stderr": 0.025284416114900156,
"acc_norm": 0.7290322580645161,
"acc_norm_stderr": 0.025284416114900156
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.035145285621750094,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.035145285621750094
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.03287666758603489,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.03287666758603489
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.030313710538198896,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.030313710538198896
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.024639789097709443,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.024639789097709443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5692307692307692,
"acc_stderr": 0.025106820660539753,
"acc_norm": 0.5692307692307692,
"acc_norm_stderr": 0.025106820660539753
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.03120469122515002,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.03120469122515002
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.036313298039696525,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.036313298039696525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7926605504587156,
"acc_stderr": 0.017381415563608674,
"acc_norm": 0.7926605504587156,
"acc_norm_stderr": 0.017381415563608674
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4027777777777778,
"acc_stderr": 0.03344887382997867,
"acc_norm": 0.4027777777777778,
"acc_norm_stderr": 0.03344887382997867
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.02786594228663933,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.02786594228663933
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.03915345408847836,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.03915345408847836
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8376068376068376,
"acc_stderr": 0.02416161812798774,
"acc_norm": 0.8376068376068376,
"acc_norm_stderr": 0.02416161812798774
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8058748403575989,
"acc_stderr": 0.014143970276657567,
"acc_norm": 0.8058748403575989,
"acc_norm_stderr": 0.014143970276657567
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6907514450867052,
"acc_stderr": 0.02488314057007176,
"acc_norm": 0.6907514450867052,
"acc_norm_stderr": 0.02488314057007176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35307262569832404,
"acc_stderr": 0.015984204545268565,
"acc_norm": 0.35307262569832404,
"acc_norm_stderr": 0.015984204545268565
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.026415601914388992,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.026415601914388992
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6635802469135802,
"acc_stderr": 0.02628973494595293,
"acc_norm": 0.6635802469135802,
"acc_norm_stderr": 0.02628973494595293
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4574468085106383,
"acc_stderr": 0.029719281272236837,
"acc_norm": 0.4574468085106383,
"acc_norm_stderr": 0.029719281272236837
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44784876140808344,
"acc_stderr": 0.012700582404768223,
"acc_norm": 0.44784876140808344,
"acc_norm_stderr": 0.012700582404768223
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.01965992249362335,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.01965992249362335
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252091,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252091
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6489795918367347,
"acc_stderr": 0.030555316755573637,
"acc_norm": 0.6489795918367347,
"acc_norm_stderr": 0.030555316755573637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.027962677604768914,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.027962677604768914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.408812729498164,
"mc1_stderr": 0.01720995215164173,
"mc2": 0.5738893811625738,
"mc2_stderr": 0.015990080392547533
},
"harness|winogrande|5": {
"acc": 0.7663772691397001,
"acc_stderr": 0.011892194477183525
},
"harness|gsm8k|5": {
"acc": 0.30856709628506446,
"acc_stderr": 0.012723076049815884
}
}
```
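If you prefer to inspect these numbers programmatically instead of copying them from the JSON above, one option (a sketch that assumes the `results` configuration loads as shown earlier) is to convert the latest aggregated split to a pandas DataFrame:

```python
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_UCLA-AGI__zephyr-7b-sft-full-SPIN-iter1",
    "results",
    split="latest",
)

# The stored schema is not documented here, so list whatever columns the
# results parquet exposes before picking out the metrics you need.
df = results.to_pandas()
print(df.columns.tolist())
```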
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_UCLA-AGI__zephyr-7b-sft-full-spin-iter1 | [
"region:us"
] | 2024-01-05T05:47:56+00:00 | {"pretty_name": "Evaluation run of UCLA-AGI/zephyr-7b-sft-full-SPIN-iter1", "dataset_summary": "Dataset automatically created during the evaluation run of model [UCLA-AGI/zephyr-7b-sft-full-SPIN-iter1](https://huggingface.co/UCLA-AGI/zephyr-7b-sft-full-SPIN-iter1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_UCLA-AGI__zephyr-7b-sft-full-SPIN-iter1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-05T06:31:48.797611](https://huggingface.co/datasets/open-llm-leaderboard/details_UCLA-AGI__zephyr-7b-sft-full-SPIN-iter1/blob/main/results_2024-01-05T06-31-48.797611.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6083070915056247,\n \"acc_stderr\": 0.032946994490981554,\n \"acc_norm\": 0.6144414847119565,\n \"acc_norm_stderr\": 0.03362118464961407,\n \"mc1\": 0.408812729498164,\n \"mc1_stderr\": 0.01720995215164173,\n \"mc2\": 0.5738893811625738,\n \"mc2_stderr\": 0.015990080392547533\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6168941979522184,\n \"acc_stderr\": 0.014206472661672876,\n \"acc_norm\": 0.658703071672355,\n \"acc_norm_stderr\": 0.013855831287497728\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6759609639514041,\n \"acc_stderr\": 0.004670581884781161,\n \"acc_norm\": 0.8544114718183629,\n \"acc_norm_stderr\": 0.003519724163310889\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.03910525752849724,\n \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.03910525752849724\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880263,\n \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880263\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.038760854559127644\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.6011560693641619,\n \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3994708994708995,\n \"acc_stderr\": 0.025225450284067884,\n \"acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.025225450284067884\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.04343525428949098,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.04343525428949098\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7290322580645161,\n \"acc_stderr\": 0.025284416114900156,\n \"acc_norm\": 0.7290322580645161,\n \"acc_norm_stderr\": 0.025284416114900156\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.035145285621750094,\n \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.035145285621750094\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.03287666758603489,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.03287666758603489\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198896,\n \"acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198896\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.024639789097709443,\n \"acc_norm\": 0.8652849740932642,\n 
\"acc_norm_stderr\": 0.024639789097709443\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5692307692307692,\n \"acc_stderr\": 0.025106820660539753,\n \"acc_norm\": 0.5692307692307692,\n \"acc_norm_stderr\": 0.025106820660539753\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.03120469122515002,\n \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.03120469122515002\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.271523178807947,\n \"acc_stderr\": 0.036313298039696525,\n \"acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.036313298039696525\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7926605504587156,\n \"acc_stderr\": 0.017381415563608674,\n \"acc_norm\": 0.7926605504587156,\n \"acc_norm_stderr\": 0.017381415563608674\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4027777777777778,\n \"acc_stderr\": 0.03344887382997867,\n \"acc_norm\": 0.4027777777777778,\n \"acc_norm_stderr\": 0.03344887382997867\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.02786594228663933,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.02786594228663933\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.03915345408847836,\n \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.03915345408847836\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516303,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516303\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8376068376068376,\n \"acc_stderr\": 0.02416161812798774,\n \"acc_norm\": 0.8376068376068376,\n \"acc_norm_stderr\": 0.02416161812798774\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8058748403575989,\n \"acc_stderr\": 0.014143970276657567,\n \"acc_norm\": 0.8058748403575989,\n \"acc_norm_stderr\": 0.014143970276657567\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6907514450867052,\n \"acc_stderr\": 0.02488314057007176,\n \"acc_norm\": 0.6907514450867052,\n \"acc_norm_stderr\": 0.02488314057007176\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35307262569832404,\n \"acc_stderr\": 0.015984204545268565,\n \"acc_norm\": 0.35307262569832404,\n \"acc_norm_stderr\": 0.015984204545268565\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.026415601914388992,\n \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.026415601914388992\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6635802469135802,\n \"acc_stderr\": 0.02628973494595293,\n \"acc_norm\": 0.6635802469135802,\n \"acc_norm_stderr\": 0.02628973494595293\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4574468085106383,\n \"acc_stderr\": 0.029719281272236837,\n \"acc_norm\": 0.4574468085106383,\n \"acc_norm_stderr\": 0.029719281272236837\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44784876140808344,\n \"acc_stderr\": 0.012700582404768223,\n \"acc_norm\": 0.44784876140808344,\n \"acc_norm_stderr\": 0.012700582404768223\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.01965992249362335,\n \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.01965992249362335\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6489795918367347,\n \"acc_stderr\": 0.030555316755573637,\n \"acc_norm\": 0.6489795918367347,\n \"acc_norm_stderr\": 0.030555316755573637\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n \"acc_stderr\": 0.027962677604768914,\n \"acc_norm\": 0.8059701492537313,\n \"acc_norm_stderr\": 0.027962677604768914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.408812729498164,\n \"mc1_stderr\": 0.01720995215164173,\n \"mc2\": 0.5738893811625738,\n \"mc2_stderr\": 0.015990080392547533\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7663772691397001,\n \"acc_stderr\": 0.011892194477183525\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.30856709628506446,\n \"acc_stderr\": 0.012723076049815884\n }\n}\n```", "repo_url": "https://huggingface.co/UCLA-AGI/zephyr-7b-sft-full-SPIN-iter1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|arc:challenge|25_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|arc:challenge|25_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|gsm8k|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|gsm8k|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hellaswag|10_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hellaswag|10_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T05-45-35.591483.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T05-45-35.591483.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T06-31-48.797611.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T06-31-48.797611.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T06-31-48.797611.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T06-31-48.797611.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T06-31-48.797611.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": 
"2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": 
"2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T05-45-35.591483.parquet"]}, 
{"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["**/details_harness|winogrande|5_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": ["**/details_harness|winogrande|5_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-05T06-31-48.797611.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_05T05_45_35.591483", "path": ["results_2024-01-05T05-45-35.591483.parquet"]}, {"split": "2024_01_05T06_31_48.797611", "path": 
["results_2024-01-05T06-31-48.797611.parquet"]}, {"split": "latest", "path": ["results_2024-01-05T06-31-48.797611.parquet"]}]}]} | 2024-01-05T06:34:12+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of UCLA-AGI/zephyr-7b-sft-full-SPIN-iter1
Dataset automatically created during the evaluation run of model UCLA-AGI/zephyr-7b-sft-full-SPIN-iter1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
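The original code snippet is not reproduced in this copy of the card; the sketch below assumes the leaderboard's usual repository naming convention (`details_<org>__<model>`) and uses the `harness_winogrande_5` configuration as an example:
```python
from datasets import load_dataset

# Repository name assumed from the leaderboard naming convention
# (details_<org>__<model>); adjust it if the actual repository differs.
data = load_dataset(
    "open-llm-leaderboard/details_UCLA-AGI__zephyr-7b-sft-full-SPIN-iter1",
    "harness_winogrande_5",  # any of the 63 configurations can be passed here
    split="train",
)
```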
## Latest results
These are the latest results from run 2024-01-05T06:31:48.797611 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of UCLA-AGI/zephyr-7b-sft-full-SPIN-iter1\n\n\n\nDataset automatically created during the evaluation run of model UCLA-AGI/zephyr-7b-sft-full-SPIN-iter1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T06:31:48.797611(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of UCLA-AGI/zephyr-7b-sft-full-SPIN-iter1\n\n\n\nDataset automatically created during the evaluation run of model UCLA-AGI/zephyr-7b-sft-full-SPIN-iter1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T06:31:48.797611(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
203,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of UCLA-AGI/zephyr-7b-sft-full-SPIN-iter1\n\n\n\nDataset automatically created during the evaluation run of model UCLA-AGI/zephyr-7b-sft-full-SPIN-iter1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-05T06:31:48.797611(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
cad65486fe413082cbef0b067dadfdfcea924fb8 | # Dataset Card for "uf_no_to_questions_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | yimingzhang/uf_no_to_questions_v2 | [
"region:us"
] | 2024-01-05T06:14:18+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train_prefs", "path": "data/train_prefs-*"}, {"split": "test_prefs", "path": "data/test_prefs-*"}]}], "dataset_info": {"features": [{"name": "chosen", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "rejected", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train_prefs", "num_bytes": 191388931, "num_examples": 61966}, {"name": "test_prefs", "num_bytes": 6168642, "num_examples": 2000}], "download_size": 108884489, "dataset_size": 197557573}} | 2024-01-05T06:14:24+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "uf_no_to_questions_v2"
More Information needed | [
"# Dataset Card for \"uf_no_to_questions_v2\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"uf_no_to_questions_v2\"\n\nMore Information needed"
] | [
6,
21
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"uf_no_to_questions_v2\"\n\nMore Information needed"
] |
8888211c429fdafa573f69b56c5730c677e05061 | This dataset has been taken from the lamini/lamini_docs dataset and has been formatted to provide a training dataset in the Llama 2 Q&A format.
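For readers unfamiliar with that format, here is a minimal sketch of the Llama 2 Q&A prompt template referred to above; the question/answer strings are placeholders and are not drawn from the dataset:
```python
# Minimal sketch of the Llama 2 instruction (Q&A) prompt format.
# The sample strings below are placeholders, not actual dataset records.
def to_llama2_qa(question: str, answer: str) -> str:
    return f"<s>[INST] {question} [/INST] {answer} </s>"

print(to_llama2_qa(
    "What does Lamini do?",                       # placeholder question
    "Lamini is a platform for fine-tuning LLMs."  # placeholder answer
))
```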
| dpelluru/llama2_lamini_docs_dataset | [
"license:cc-by-4.0",
"region:us"
] | 2024-01-05T06:24:17+00:00 | {"license": "cc-by-4.0"} | 2024-01-05T06:27:29+00:00 | [] | [] | TAGS
#license-cc-by-4.0 #region-us
| This dataset has been taken from the lamini/lamini_docs dataset and has been formatted to provide a training dataset in the Llama 2 Q&A format.
| [] | [
"TAGS\n#license-cc-by-4.0 #region-us \n"
] | [
15
] | [
"passage: TAGS\n#license-cc-by-4.0 #region-us \n"
] |
991bba463730c80f40722982249bab44e8c616dc |
Sources:
https://www.niod.nl/en/collections/image-bank-ww2 | GDavila/ww2-photos | [
"region:us"
] | 2024-01-05T06:47:23+00:00 | {} | 2024-01-05T06:47:40+00:00 | [] | [] | TAGS
#region-us
|
Sources:
URL | [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
6fd7cb48a51aaa15dda51ab07d1f6e08711e6995 |
# Dataset Card for Evaluation run of Weyaxi/Nous-Hermes-2-SUS-Chat-2x34B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Weyaxi/Nous-Hermes-2-SUS-Chat-2x34B](https://huggingface.co/Weyaxi/Nous-Hermes-2-SUS-Chat-2x34B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__Nous-Hermes-2-SUS-Chat-2x34B",
"harness_winogrande_5",
split="train")
```
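The aggregated scores can be loaded in the same way through the `"results"` configuration; a minimal sketch, assuming the `"latest"` split name that these leaderboard detail repositories conventionally expose:
```python
from datasets import load_dataset

# Aggregated metrics for this model; "latest" conventionally points to the
# most recent evaluation run (an assumption based on the leaderboard convention).
results = load_dataset(
    "open-llm-leaderboard/details_Weyaxi__Nous-Hermes-2-SUS-Chat-2x34B",
    "results",
    split="latest",
)
```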
## Latest results
These are the [latest results from run 2024-01-05T07:39:39.668066](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Nous-Hermes-2-SUS-Chat-2x34B/blob/main/results_2024-01-05T07-39-39.668066.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7631944050711235,
"acc_stderr": 0.02806486884476764,
"acc_norm": 0.7662955144547114,
"acc_norm_stderr": 0.028608371446490685,
"mc1": 0.4149326805385557,
"mc1_stderr": 0.017248314465805978,
"mc2": 0.5741628635210513,
"mc2_stderr": 0.01486973924236114
},
"harness|arc:challenge|25": {
"acc": 0.6382252559726962,
"acc_stderr": 0.014041957945038076,
"acc_norm": 0.6680887372013652,
"acc_norm_stderr": 0.013760988200880534
},
"harness|hellaswag|10": {
"acc": 0.657239593706433,
"acc_stderr": 0.004736621698861176,
"acc_norm": 0.8522206731726748,
"acc_norm_stderr": 0.003541558263779117
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7481481481481481,
"acc_stderr": 0.0374985070917402,
"acc_norm": 0.7481481481481481,
"acc_norm_stderr": 0.0374985070917402
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.9013157894736842,
"acc_stderr": 0.024270227737522715,
"acc_norm": 0.9013157894736842,
"acc_norm_stderr": 0.024270227737522715
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7849056603773585,
"acc_stderr": 0.025288394502891366,
"acc_norm": 0.7849056603773585,
"acc_norm_stderr": 0.025288394502891366
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8958333333333334,
"acc_stderr": 0.025545239210256917,
"acc_norm": 0.8958333333333334,
"acc_norm_stderr": 0.025545239210256917
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.03456425745086999,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.03456425745086999
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.04951218252396262,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.04951218252396262
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7787234042553192,
"acc_stderr": 0.027136349602424056,
"acc_norm": 0.7787234042553192,
"acc_norm_stderr": 0.027136349602424056
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7862068965517242,
"acc_stderr": 0.034165204477475494,
"acc_norm": 0.7862068965517242,
"acc_norm_stderr": 0.034165204477475494
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6798941798941799,
"acc_stderr": 0.024026846392873506,
"acc_norm": 0.6798941798941799,
"acc_norm_stderr": 0.024026846392873506
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5634920634920635,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.5634920634920635,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8935483870967742,
"acc_stderr": 0.017545102951656635,
"acc_norm": 0.8935483870967742,
"acc_norm_stderr": 0.017545102951656635
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6502463054187192,
"acc_stderr": 0.03355400904969566,
"acc_norm": 0.6502463054187192,
"acc_norm_stderr": 0.03355400904969566
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8727272727272727,
"acc_stderr": 0.026024657651656194,
"acc_norm": 0.8727272727272727,
"acc_norm_stderr": 0.026024657651656194
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.02239078763821677,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.02239078763821677
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9740932642487047,
"acc_stderr": 0.01146452335695318,
"acc_norm": 0.9740932642487047,
"acc_norm_stderr": 0.01146452335695318
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8,
"acc_stderr": 0.020280805062535726,
"acc_norm": 0.8,
"acc_norm_stderr": 0.020280805062535726
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.48518518518518516,
"acc_stderr": 0.0304721532493286,
"acc_norm": 0.48518518518518516,
"acc_norm_stderr": 0.0304721532493286
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8571428571428571,
"acc_stderr": 0.022730208119306535,
"acc_norm": 0.8571428571428571,
"acc_norm_stderr": 0.022730208119306535
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5099337748344371,
"acc_stderr": 0.04081677107248436,
"acc_norm": 0.5099337748344371,
"acc_norm_stderr": 0.04081677107248436
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9229357798165138,
"acc_stderr": 0.011434381698911096,
"acc_norm": 0.9229357798165138,
"acc_norm_stderr": 0.011434381698911096
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.03191923445686186,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.03191923445686186
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9166666666666666,
"acc_stderr": 0.019398452135813905,
"acc_norm": 0.9166666666666666,
"acc_norm_stderr": 0.019398452135813905
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9113924050632911,
"acc_stderr": 0.018498315206865387,
"acc_norm": 0.9113924050632911,
"acc_norm_stderr": 0.018498315206865387
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7892376681614349,
"acc_stderr": 0.02737309550054019,
"acc_norm": 0.7892376681614349,
"acc_norm_stderr": 0.02737309550054019
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8854961832061069,
"acc_stderr": 0.027927473753597446,
"acc_norm": 0.8854961832061069,
"acc_norm_stderr": 0.027927473753597446
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9090909090909091,
"acc_stderr": 0.02624319405407388,
"acc_norm": 0.9090909090909091,
"acc_norm_stderr": 0.02624319405407388
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.03038159675665168,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.03038159675665168
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8650306748466258,
"acc_stderr": 0.026845765054553855,
"acc_norm": 0.8650306748466258,
"acc_norm_stderr": 0.026845765054553855
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6071428571428571,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.6071428571428571,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.9320388349514563,
"acc_stderr": 0.02491995914251447,
"acc_norm": 0.9320388349514563,
"acc_norm_stderr": 0.02491995914251447
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9316239316239316,
"acc_stderr": 0.01653462768431136,
"acc_norm": 0.9316239316239316,
"acc_norm_stderr": 0.01653462768431136
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9054916985951469,
"acc_stderr": 0.010461015338193068,
"acc_norm": 0.9054916985951469,
"acc_norm_stderr": 0.010461015338193068
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8265895953757225,
"acc_stderr": 0.020383229551135033,
"acc_norm": 0.8265895953757225,
"acc_norm_stderr": 0.020383229551135033
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.641340782122905,
"acc_stderr": 0.016040454426164474,
"acc_norm": 0.641340782122905,
"acc_norm_stderr": 0.016040454426164474
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8594771241830066,
"acc_stderr": 0.019899435463539946,
"acc_norm": 0.8594771241830066,
"acc_norm_stderr": 0.019899435463539946
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8327974276527331,
"acc_stderr": 0.02119387252803497,
"acc_norm": 0.8327974276527331,
"acc_norm_stderr": 0.02119387252803497
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8919753086419753,
"acc_stderr": 0.017271763084483527,
"acc_norm": 0.8919753086419753,
"acc_norm_stderr": 0.017271763084483527
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6382978723404256,
"acc_stderr": 0.02866382014719949,
"acc_norm": 0.6382978723404256,
"acc_norm_stderr": 0.02866382014719949
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6016949152542372,
"acc_stderr": 0.012503310565166233,
"acc_norm": 0.6016949152542372,
"acc_norm_stderr": 0.012503310565166233
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8125,
"acc_stderr": 0.023709788253811766,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.023709788253811766
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.826797385620915,
"acc_stderr": 0.01530932926696914,
"acc_norm": 0.826797385620915,
"acc_norm_stderr": 0.01530932926696914
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8244897959183674,
"acc_stderr": 0.02435280072297001,
"acc_norm": 0.8244897959183674,
"acc_norm_stderr": 0.02435280072297001
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8905472636815921,
"acc_stderr": 0.022076326101824664,
"acc_norm": 0.8905472636815921,
"acc_norm_stderr": 0.022076326101824664
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.03851597683718533,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.03851597683718533
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8830409356725146,
"acc_stderr": 0.02464806896136616,
"acc_norm": 0.8830409356725146,
"acc_norm_stderr": 0.02464806896136616
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4149326805385557,
"mc1_stderr": 0.017248314465805978,
"mc2": 0.5741628635210513,
"mc2_stderr": 0.01486973924236114
},
"harness|winogrande|5": {
"acc": 0.8374112075769534,
"acc_stderr": 0.010370455551343324
},
"harness|gsm8k|5": {
"acc": 0.7308567096285065,
"acc_stderr": 0.012216595457292733
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Weyaxi__Nous-Hermes-2-SUS-Chat-2x34B | [
"region:us"
] | 2024-01-05T07:41:52+00:00 | {"pretty_name": "Evaluation run of Weyaxi/Nous-Hermes-2-SUS-Chat-2x34B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Weyaxi/Nous-Hermes-2-SUS-Chat-2x34B](https://huggingface.co/Weyaxi/Nous-Hermes-2-SUS-Chat-2x34B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__Nous-Hermes-2-SUS-Chat-2x34B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-05T07:39:39.668066](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Nous-Hermes-2-SUS-Chat-2x34B/blob/main/results_2024-01-05T07-39-39.668066.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7631944050711235,\n \"acc_stderr\": 0.02806486884476764,\n \"acc_norm\": 0.7662955144547114,\n \"acc_norm_stderr\": 0.028608371446490685,\n \"mc1\": 0.4149326805385557,\n \"mc1_stderr\": 0.017248314465805978,\n \"mc2\": 0.5741628635210513,\n \"mc2_stderr\": 0.01486973924236114\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6382252559726962,\n \"acc_stderr\": 0.014041957945038076,\n \"acc_norm\": 0.6680887372013652,\n \"acc_norm_stderr\": 0.013760988200880534\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.657239593706433,\n \"acc_stderr\": 0.004736621698861176,\n \"acc_norm\": 0.8522206731726748,\n \"acc_norm_stderr\": 0.003541558263779117\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7481481481481481,\n \"acc_stderr\": 0.0374985070917402,\n \"acc_norm\": 0.7481481481481481,\n \"acc_norm_stderr\": 0.0374985070917402\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.9013157894736842,\n \"acc_stderr\": 0.024270227737522715,\n \"acc_norm\": 0.9013157894736842,\n \"acc_norm_stderr\": 0.024270227737522715\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7849056603773585,\n \"acc_stderr\": 0.025288394502891366,\n \"acc_norm\": 0.7849056603773585,\n \"acc_norm_stderr\": 0.025288394502891366\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8958333333333334,\n \"acc_stderr\": 0.025545239210256917,\n \"acc_norm\": 0.8958333333333334,\n \"acc_norm_stderr\": 0.025545239210256917\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.03456425745086999,\n \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.03456425745086999\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5490196078431373,\n \"acc_stderr\": 0.04951218252396262,\n \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.04951218252396262\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7787234042553192,\n \"acc_stderr\": 0.027136349602424056,\n \"acc_norm\": 0.7787234042553192,\n \"acc_norm_stderr\": 0.027136349602424056\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7862068965517242,\n \"acc_stderr\": 0.034165204477475494,\n \"acc_norm\": 0.7862068965517242,\n \"acc_norm_stderr\": 0.034165204477475494\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6798941798941799,\n \"acc_stderr\": 0.024026846392873506,\n \"acc_norm\": 0.6798941798941799,\n \"acc_norm_stderr\": 0.024026846392873506\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5634920634920635,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.5634920634920635,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8935483870967742,\n \"acc_stderr\": 0.017545102951656635,\n \"acc_norm\": 0.8935483870967742,\n \"acc_norm_stderr\": 0.017545102951656635\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6502463054187192,\n \"acc_stderr\": 0.03355400904969566,\n \"acc_norm\": 0.6502463054187192,\n \"acc_norm_stderr\": 0.03355400904969566\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8727272727272727,\n \"acc_stderr\": 0.026024657651656194,\n \"acc_norm\": 0.8727272727272727,\n \"acc_norm_stderr\": 0.026024657651656194\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.02239078763821677,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.02239078763821677\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9740932642487047,\n \"acc_stderr\": 0.01146452335695318,\n \"acc_norm\": 0.9740932642487047,\n \"acc_norm_stderr\": 0.01146452335695318\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.020280805062535726,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.020280805062535726\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.48518518518518516,\n \"acc_stderr\": 0.0304721532493286,\n \"acc_norm\": 0.48518518518518516,\n \"acc_norm_stderr\": 0.0304721532493286\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8571428571428571,\n \"acc_stderr\": 0.022730208119306535,\n \"acc_norm\": 0.8571428571428571,\n \"acc_norm_stderr\": 0.022730208119306535\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5099337748344371,\n \"acc_stderr\": 0.04081677107248436,\n \"acc_norm\": 0.5099337748344371,\n \"acc_norm_stderr\": 0.04081677107248436\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9229357798165138,\n \"acc_stderr\": 0.011434381698911096,\n \"acc_norm\": 0.9229357798165138,\n \"acc_norm_stderr\": 0.011434381698911096\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.03191923445686186,\n \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.03191923445686186\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9166666666666666,\n \"acc_stderr\": 0.019398452135813905,\n \"acc_norm\": 0.9166666666666666,\n \"acc_norm_stderr\": 0.019398452135813905\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9113924050632911,\n \"acc_stderr\": 0.018498315206865387,\n \"acc_norm\": 0.9113924050632911,\n \"acc_norm_stderr\": 0.018498315206865387\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7892376681614349,\n \"acc_stderr\": 0.02737309550054019,\n \"acc_norm\": 0.7892376681614349,\n \"acc_norm_stderr\": 0.02737309550054019\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8854961832061069,\n \"acc_stderr\": 0.027927473753597446,\n \"acc_norm\": 0.8854961832061069,\n \"acc_norm_stderr\": 0.027927473753597446\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9090909090909091,\n \"acc_stderr\": 0.02624319405407388,\n \"acc_norm\": 0.9090909090909091,\n \"acc_norm_stderr\": 0.02624319405407388\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.03038159675665168,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.03038159675665168\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8650306748466258,\n \"acc_stderr\": 0.026845765054553855,\n \"acc_norm\": 0.8650306748466258,\n \"acc_norm_stderr\": 0.026845765054553855\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6071428571428571,\n \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.6071428571428571,\n \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.9320388349514563,\n \"acc_stderr\": 0.02491995914251447,\n \"acc_norm\": 0.9320388349514563,\n \"acc_norm_stderr\": 0.02491995914251447\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9316239316239316,\n \"acc_stderr\": 0.01653462768431136,\n \"acc_norm\": 0.9316239316239316,\n \"acc_norm_stderr\": 0.01653462768431136\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9054916985951469,\n 
\"acc_stderr\": 0.010461015338193068,\n \"acc_norm\": 0.9054916985951469,\n \"acc_norm_stderr\": 0.010461015338193068\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8265895953757225,\n \"acc_stderr\": 0.020383229551135033,\n \"acc_norm\": 0.8265895953757225,\n \"acc_norm_stderr\": 0.020383229551135033\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.641340782122905,\n \"acc_stderr\": 0.016040454426164474,\n \"acc_norm\": 0.641340782122905,\n \"acc_norm_stderr\": 0.016040454426164474\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8594771241830066,\n \"acc_stderr\": 0.019899435463539946,\n \"acc_norm\": 0.8594771241830066,\n \"acc_norm_stderr\": 0.019899435463539946\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8327974276527331,\n \"acc_stderr\": 0.02119387252803497,\n \"acc_norm\": 0.8327974276527331,\n \"acc_norm_stderr\": 0.02119387252803497\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8919753086419753,\n \"acc_stderr\": 0.017271763084483527,\n \"acc_norm\": 0.8919753086419753,\n \"acc_norm_stderr\": 0.017271763084483527\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6382978723404256,\n \"acc_stderr\": 0.02866382014719949,\n \"acc_norm\": 0.6382978723404256,\n \"acc_norm_stderr\": 0.02866382014719949\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6016949152542372,\n \"acc_stderr\": 0.012503310565166233,\n \"acc_norm\": 0.6016949152542372,\n \"acc_norm_stderr\": 0.012503310565166233\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8125,\n \"acc_stderr\": 0.023709788253811766,\n \"acc_norm\": 0.8125,\n \"acc_norm_stderr\": 0.023709788253811766\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.826797385620915,\n \"acc_stderr\": 0.01530932926696914,\n \"acc_norm\": 0.826797385620915,\n \"acc_norm_stderr\": 0.01530932926696914\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8244897959183674,\n \"acc_stderr\": 0.02435280072297001,\n \"acc_norm\": 0.8244897959183674,\n \"acc_norm_stderr\": 0.02435280072297001\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8905472636815921,\n \"acc_stderr\": 0.022076326101824664,\n \"acc_norm\": 0.8905472636815921,\n \"acc_norm_stderr\": 0.022076326101824664\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n \"acc_stderr\": 0.03851597683718533,\n \"acc_norm\": 0.572289156626506,\n \"acc_norm_stderr\": 0.03851597683718533\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.02464806896136616,\n \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.02464806896136616\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4149326805385557,\n \"mc1_stderr\": 0.017248314465805978,\n \"mc2\": 0.5741628635210513,\n \"mc2_stderr\": 0.01486973924236114\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8374112075769534,\n \"acc_stderr\": 0.010370455551343324\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7308567096285065,\n \"acc_stderr\": 0.012216595457292733\n }\n}\n```", "repo_url": 
"https://huggingface.co/Weyaxi/Nous-Hermes-2-SUS-Chat-2x34B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|arc:challenge|25_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|gsm8k|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hellaswag|10_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T07-39-39.668066.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T07-39-39.668066.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T07-39-39.668066.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T07-39-39.668066.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T07-39-39.668066.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_05T07_39_39.668066", "path": ["**/details_harness|winogrande|5_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-05T07-39-39.668066.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_05T07_39_39.668066", "path": ["results_2024-01-05T07-39-39.668066.parquet"]}, {"split": "latest", "path": ["results_2024-01-05T07-39-39.668066.parquet"]}]}]} | 2024-01-05T07:42:16+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Weyaxi/Nous-Hermes-2-SUS-Chat-2x34B
Dataset automatically created during the evaluation run of model Weyaxi/Nous-Hermes-2-SUS-Chat-2x34B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
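The snippet below mirrors the loader recorded in this card's metadata summary; it pulls the `harness_winogrande_5` configuration of this run (the repository name comes straight from that metadata, nothing here is invented):

```python
from datasets import load_dataset

# Load the 5-shot Winogrande details for this evaluation run
data = load_dataset("open-llm-leaderboard/details_Weyaxi__Nous-Hermes-2-SUS-Chat-2x34B",
                    "harness_winogrande_5",
                    split="train")
```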
## Latest results
These are the latest results from run 2024-01-05T07:39:39.668066 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Weyaxi/Nous-Hermes-2-SUS-Chat-2x34B\n\n\n\nDataset automatically created during the evaluation run of model Weyaxi/Nous-Hermes-2-SUS-Chat-2x34B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T07:39:39.668066(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Weyaxi/Nous-Hermes-2-SUS-Chat-2x34B\n\n\n\nDataset automatically created during the evaluation run of model Weyaxi/Nous-Hermes-2-SUS-Chat-2x34B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T07:39:39.668066(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
197,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Weyaxi/Nous-Hermes-2-SUS-Chat-2x34B\n\n\n\nDataset automatically created during the evaluation run of model Weyaxi/Nous-Hermes-2-SUS-Chat-2x34B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-05T07:39:39.668066(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
bdfb16f87b552760e4388ad9c9ee2a559f54e39b |
# More Information Needed
E-mail: [email protected] | TomokiFujihara/supplement_for_gray_area_comment | [
"license:apache-2.0",
"region:us"
] | 2024-01-05T07:42:32+00:00 | {"license": "apache-2.0"} | 2024-01-30T05:31:10+00:00 | [] | [] | TAGS
#license-apache-2.0 #region-us
|
#
E-mail: tomoki.fujihara.p3@URL | [
"# \nE-mail: tomoki.fujihara.p3@URL"
] | [
"TAGS\n#license-apache-2.0 #region-us \n",
"# \nE-mail: tomoki.fujihara.p3@URL"
] | [
14,
16
] | [
"passage: TAGS\n#license-apache-2.0 #region-us \n# \nE-mail: tomoki.fujihara.p3@URL"
] |
5bf707917bf56935b1ebd82c042f5c241d3d77c3 |
# Dataset Card for Evaluation run of decem/Dionysus-Mistral-m3-v5
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [decem/Dionysus-Mistral-m3-v5](https://huggingface.co/decem/Dionysus-Mistral-m3-v5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_decem__Dionysus-Mistral-m3-v5",
"harness_winogrande_5",
split="train")
```
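If you want the aggregated scores rather than the per-task details, a minimal sketch along the same lines (assuming the standard `results` configuration and `latest` split that these leaderboard detail datasets expose) would be:

```python
from datasets import load_dataset

# "results" aggregates all task metrics for the run; "latest" points at the newest upload
results = load_dataset("open-llm-leaderboard/details_decem__Dionysus-Mistral-m3-v5",
                       "results",
                       split="latest")
print(results[0])  # one row holding the aggregated metrics for this run
```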
## Latest results
These are the [latest results from run 2024-01-05T07:41:42.571559](https://huggingface.co/datasets/open-llm-leaderboard/details_decem__Dionysus-Mistral-m3-v5/blob/main/results_2024-01-05T07-41-42.571559.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6119654723808889,
"acc_stderr": 0.03286363978834633,
"acc_norm": 0.6149242684593406,
"acc_norm_stderr": 0.03352054924433844,
"mc1": 0.35128518971848227,
"mc1_stderr": 0.0167113581635444,
"mc2": 0.5093204017955075,
"mc2_stderr": 0.015839968447220742
},
"harness|arc:challenge|25": {
"acc": 0.5648464163822525,
"acc_stderr": 0.014487986197186043,
"acc_norm": 0.5955631399317406,
"acc_norm_stderr": 0.01434203648343618
},
"harness|hellaswag|10": {
"acc": 0.6283608842859988,
"acc_stderr": 0.004822550638450897,
"acc_norm": 0.8098984266082454,
"acc_norm_stderr": 0.003915792315457797
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316091,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316091
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.03656343653353159,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.03656343653353159
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.04576665403207762,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.04576665403207762
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6085106382978723,
"acc_stderr": 0.03190701242326812,
"acc_norm": 0.6085106382978723,
"acc_norm_stderr": 0.03190701242326812
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192118,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192118
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.025197101074246487,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.025197101074246487
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7354838709677419,
"acc_stderr": 0.02509189237885928,
"acc_norm": 0.7354838709677419,
"acc_norm_stderr": 0.02509189237885928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.02985751567338642,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.02985751567338642
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8134715025906736,
"acc_stderr": 0.028112091210117474,
"acc_norm": 0.8134715025906736,
"acc_norm_stderr": 0.028112091210117474
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6358974358974359,
"acc_stderr": 0.024396672985094753,
"acc_norm": 0.6358974358974359,
"acc_norm_stderr": 0.024396672985094753
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114993,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114993
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6302521008403361,
"acc_stderr": 0.031357095996135904,
"acc_norm": 0.6302521008403361,
"acc_norm_stderr": 0.031357095996135904
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.015990154885073403,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.015990154885073403
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.02862654791243741,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.02862654791243741
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.02675082699467618,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.02675082699467618
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794089,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794089
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.023636873317489277,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.023636873317489277
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7982120051085568,
"acc_stderr": 0.014351702181636864,
"acc_norm": 0.7982120051085568,
"acc_norm_stderr": 0.014351702181636864
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6502890173410405,
"acc_stderr": 0.025674281456531015,
"acc_norm": 0.6502890173410405,
"acc_norm_stderr": 0.025674281456531015
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.31843575418994413,
"acc_stderr": 0.015581008080360276,
"acc_norm": 0.31843575418994413,
"acc_norm_stderr": 0.015581008080360276
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.02699254433929724,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.02699254433929724
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6728395061728395,
"acc_stderr": 0.026105673861409825,
"acc_norm": 0.6728395061728395,
"acc_norm_stderr": 0.026105673861409825
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.43617021276595747,
"acc_stderr": 0.02958345203628407,
"acc_norm": 0.43617021276595747,
"acc_norm_stderr": 0.02958345203628407
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43089960886571055,
"acc_stderr": 0.012647695889547228,
"acc_norm": 0.43089960886571055,
"acc_norm_stderr": 0.012647695889547228
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.0290294228156814,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.0290294228156814
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.019506291693954854,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.019506291693954854
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6571428571428571,
"acc_stderr": 0.030387262919547738,
"acc_norm": 0.6571428571428571,
"acc_norm_stderr": 0.030387262919547738
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233257,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233257
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35128518971848227,
"mc1_stderr": 0.0167113581635444,
"mc2": 0.5093204017955075,
"mc2_stderr": 0.015839968447220742
},
"harness|winogrande|5": {
"acc": 0.7513812154696132,
"acc_stderr": 0.012147314713403107
},
"harness|gsm8k|5": {
"acc": 0.510235026535254,
"acc_stderr": 0.013769598923012395
}
}
```
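The per-task scores above can be aggregated programmatically. The snippet below is a minimal sketch (not part of the original card): it assumes the JSON above has been parsed into a Python dict, of which only two entries are reproduced here, and averages the `acc` values of the `hendrycksTest` (MMLU) subtasks.

```python
# Minimal sketch: average the MMLU ("hendrycksTest") accuracies from the
# results JSON above. Only two subtasks are reproduced here for brevity;
# in practice `results` would be the full parsed JSON shown above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.37},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.562962962962963},
}

mmlu_accs = [
    scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
]
print(f"Average MMLU accuracy over {len(mmlu_accs)} subtasks: {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```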
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_decem__Dionysus-Mistral-m3-v5 | [
"region:us"
] | 2024-01-05T07:44:01+00:00 | {"pretty_name": "Evaluation run of decem/Dionysus-Mistral-m3-v5", "dataset_summary": "Dataset automatically created during the evaluation run of model [decem/Dionysus-Mistral-m3-v5](https://huggingface.co/decem/Dionysus-Mistral-m3-v5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_decem__Dionysus-Mistral-m3-v5\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-05T07:41:42.571559](https://huggingface.co/datasets/open-llm-leaderboard/details_decem__Dionysus-Mistral-m3-v5/blob/main/results_2024-01-05T07-41-42.571559.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6119654723808889,\n \"acc_stderr\": 0.03286363978834633,\n \"acc_norm\": 0.6149242684593406,\n \"acc_norm_stderr\": 0.03352054924433844,\n \"mc1\": 0.35128518971848227,\n \"mc1_stderr\": 0.0167113581635444,\n \"mc2\": 0.5093204017955075,\n \"mc2_stderr\": 0.015839968447220742\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5648464163822525,\n \"acc_stderr\": 0.014487986197186043,\n \"acc_norm\": 0.5955631399317406,\n \"acc_norm_stderr\": 0.01434203648343618\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6283608842859988,\n \"acc_stderr\": 0.004822550638450897,\n \"acc_norm\": 0.8098984266082454,\n \"acc_norm_stderr\": 0.003915792315457797\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316091,\n \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316091\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.6527777777777778,\n \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.03656343653353159,\n \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.03656343653353159\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.04576665403207762,\n \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.04576665403207762\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6085106382978723,\n \"acc_stderr\": 0.03190701242326812,\n \"acc_norm\": 0.6085106382978723,\n \"acc_norm_stderr\": 0.03190701242326812\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192118,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192118\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.025197101074246487,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.025197101074246487\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7354838709677419,\n \"acc_stderr\": 0.02509189237885928,\n \"acc_norm\": 0.7354838709677419,\n \"acc_norm_stderr\": 0.02509189237885928\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.02985751567338642,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.02985751567338642\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8134715025906736,\n \"acc_stderr\": 0.028112091210117474,\n \"acc_norm\": 0.8134715025906736,\n \"acc_norm_stderr\": 0.028112091210117474\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6358974358974359,\n \"acc_stderr\": 0.024396672985094753,\n \"acc_norm\": 0.6358974358974359,\n \"acc_norm_stderr\": 0.024396672985094753\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114993,\n \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114993\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.031357095996135904,\n \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.031357095996135904\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8330275229357799,\n \"acc_stderr\": 0.015990154885073403,\n \"acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.015990154885073403\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977749,\n \"acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.02862654791243741,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.02862654791243741\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467618,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467618\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794089,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794089\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n \"acc_stderr\": 0.023636873317489277,\n \"acc_norm\": 0.8461538461538461,\n \"acc_norm_stderr\": 0.023636873317489277\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.7982120051085568,\n \"acc_stderr\": 0.014351702181636864,\n \"acc_norm\": 0.7982120051085568,\n \"acc_norm_stderr\": 0.014351702181636864\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6502890173410405,\n \"acc_stderr\": 0.025674281456531015,\n \"acc_norm\": 0.6502890173410405,\n \"acc_norm_stderr\": 0.025674281456531015\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.31843575418994413,\n \"acc_stderr\": 0.015581008080360276,\n \"acc_norm\": 0.31843575418994413,\n \"acc_norm_stderr\": 0.015581008080360276\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.02699254433929724,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.02699254433929724\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6728395061728395,\n \"acc_stderr\": 0.026105673861409825,\n \"acc_norm\": 0.6728395061728395,\n \"acc_norm_stderr\": 0.026105673861409825\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.43617021276595747,\n \"acc_stderr\": 0.02958345203628407,\n \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.02958345203628407\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43089960886571055,\n \"acc_stderr\": 0.012647695889547228,\n \"acc_norm\": 0.43089960886571055,\n \"acc_norm_stderr\": 0.012647695889547228\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.0290294228156814,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.0290294228156814\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.019506291693954854,\n \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.019506291693954854\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6571428571428571,\n \"acc_stderr\": 0.030387262919547738,\n \"acc_norm\": 0.6571428571428571,\n \"acc_norm_stderr\": 0.030387262919547738\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.026508590656233257,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.026508590656233257\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35128518971848227,\n \"mc1_stderr\": 0.0167113581635444,\n \"mc2\": 0.5093204017955075,\n \"mc2_stderr\": 0.015839968447220742\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7513812154696132,\n \"acc_stderr\": 0.012147314713403107\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.510235026535254,\n \"acc_stderr\": 
0.013769598923012395\n }\n}\n```", "repo_url": "https://huggingface.co/decem/Dionysus-Mistral-m3-v5", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|arc:challenge|25_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|gsm8k|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hellaswag|10_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T07-41-42.571559.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T07-41-42.571559.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T07-41-42.571559.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T07-41-42.571559.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T07-41-42.571559.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_05T07_41_42.571559", "path": ["**/details_harness|winogrande|5_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-05T07-41-42.571559.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_05T07_41_42.571559", "path": ["results_2024-01-05T07-41-42.571559.parquet"]}, {"split": "latest", "path": ["results_2024-01-05T07-41-42.571559.parquet"]}]}]} | 2024-01-05T07:44:22+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of decem/Dionysus-Mistral-m3-v5
Dataset automatically created during the evaluation run of model decem/Dionysus-Mistral-m3-v5 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-05T07:41:42.571559 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of decem/Dionysus-Mistral-m3-v5\n\n\n\nDataset automatically created during the evaluation run of model decem/Dionysus-Mistral-m3-v5 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T07:41:42.571559(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of decem/Dionysus-Mistral-m3-v5\n\n\n\nDataset automatically created during the evaluation run of model decem/Dionysus-Mistral-m3-v5 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T07:41:42.571559(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
189,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of decem/Dionysus-Mistral-m3-v5\n\n\n\nDataset automatically created during the evaluation run of model decem/Dionysus-Mistral-m3-v5 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-05T07:41:42.571559(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
f7aae409e0e122ed37020d894884ada7ed5d2172 |
# Dataset Card for Evaluation run of qblocks/gpt2_137m_DolphinCoder
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [qblocks/gpt2_137m_DolphinCoder](https://huggingface.co/qblocks/gpt2_137m_DolphinCoder) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_qblocks__gpt2_137m_DolphinCoder",
"harness_winogrande_5",
split="train")
```
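
The same pattern works for the other per-task configurations and for the aggregated "results" configuration described above. Below is a minimal sketch; the config and split names follow this card's metadata (the timestamped split shown matches this run), so adjust them if newer runs are added later:

```python
from datasets import get_dataset_config_names, load_dataset

# List every available configuration (one per evaluated task, plus "results").
configs = get_dataset_config_names("open-llm-leaderboard/details_qblocks__gpt2_137m_DolphinCoder")
print(len(configs), "configurations found")

# The aggregated metrics live in the "results" configuration. The "latest"
# split always points at the most recent run; timestamped splits such as
# "2024_01_05T07_48_29.644069" pin a specific run.
results = load_dataset(
    "open-llm-leaderboard/details_qblocks__gpt2_137m_DolphinCoder",
    "results",
    split="latest",
)
print(results[0])
```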
## Latest results
These are the [latest results from run 2024-01-05T07:48:29.644069](https://huggingface.co/datasets/open-llm-leaderboard/details_qblocks__gpt2_137m_DolphinCoder/blob/main/results_2024-01-05T07-48-29.644069.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2541058154915133,
"acc_stderr": 0.030552087768393632,
"acc_norm": 0.2544491269494238,
"acc_norm_stderr": 0.03131084218129606,
"mc1": 0.2252141982864137,
"mc1_stderr": 0.014623240768023496,
"mc2": 0.41575126598869544,
"mc2_stderr": 0.015079894627974334
},
"harness|arc:challenge|25": {
"acc": 0.19795221843003413,
"acc_stderr": 0.011643990971573395,
"acc_norm": 0.21843003412969283,
"acc_norm_stderr": 0.012074291605700983
},
"harness|hellaswag|10": {
"acc": 0.29117705636327423,
"acc_stderr": 0.00453376468621199,
"acc_norm": 0.3134833698466441,
"acc_norm_stderr": 0.004629608863272312
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.038850042458002554,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.038850042458002554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.22641509433962265,
"acc_stderr": 0.025757559893106737,
"acc_norm": 0.22641509433962265,
"acc_norm_stderr": 0.025757559893106737
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2152777777777778,
"acc_stderr": 0.034370793441061344,
"acc_norm": 0.2152777777777778,
"acc_norm_stderr": 0.034370793441061344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.03295304696818318,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.03295304696818318
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036622,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036622
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281336,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281336
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.038552896163789485,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.038552896163789485
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.022019080012217893,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.022019080012217893
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1349206349206349,
"acc_stderr": 0.030557101589417508,
"acc_norm": 0.1349206349206349,
"acc_norm_stderr": 0.030557101589417508
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2129032258064516,
"acc_stderr": 0.023287665127268552,
"acc_norm": 0.2129032258064516,
"acc_norm_stderr": 0.023287665127268552
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.21674876847290642,
"acc_stderr": 0.028990331252516235,
"acc_norm": 0.21674876847290642,
"acc_norm_stderr": 0.028990331252516235
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21212121212121213,
"acc_stderr": 0.03192271569548299,
"acc_norm": 0.21212121212121213,
"acc_norm_stderr": 0.03192271569548299
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35353535353535354,
"acc_stderr": 0.03406086723547153,
"acc_norm": 0.35353535353535354,
"acc_norm_stderr": 0.03406086723547153
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.36787564766839376,
"acc_stderr": 0.03480175668466036,
"acc_norm": 0.36787564766839376,
"acc_norm_stderr": 0.03480175668466036
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2230769230769231,
"acc_stderr": 0.02110773012724399,
"acc_norm": 0.2230769230769231,
"acc_norm_stderr": 0.02110773012724399
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.026466117538959916,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.026466117538959916
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24369747899159663,
"acc_stderr": 0.02788682807838056,
"acc_norm": 0.24369747899159663,
"acc_norm_stderr": 0.02788682807838056
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2119205298013245,
"acc_stderr": 0.033367670865679766,
"acc_norm": 0.2119205298013245,
"acc_norm_stderr": 0.033367670865679766
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3467889908256881,
"acc_stderr": 0.020406097104093027,
"acc_norm": 0.3467889908256881,
"acc_norm_stderr": 0.020406097104093027
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2869198312236287,
"acc_stderr": 0.029443773022594693,
"acc_norm": 0.2869198312236287,
"acc_norm_stderr": 0.029443773022594693
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.25112107623318386,
"acc_stderr": 0.029105220833224633,
"acc_norm": 0.25112107623318386,
"acc_norm_stderr": 0.029105220833224633
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.23140495867768596,
"acc_stderr": 0.03849856098794089,
"acc_norm": 0.23140495867768596,
"acc_norm_stderr": 0.03849856098794089
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3128834355828221,
"acc_stderr": 0.036429145782924055,
"acc_norm": 0.3128834355828221,
"acc_norm_stderr": 0.036429145782924055
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.17857142857142858,
"acc_stderr": 0.03635209121577806,
"acc_norm": 0.17857142857142858,
"acc_norm_stderr": 0.03635209121577806
},
"harness|hendrycksTest-management|5": {
"acc": 0.3592233009708738,
"acc_stderr": 0.04750458399041692,
"acc_norm": 0.3592233009708738,
"acc_norm_stderr": 0.04750458399041692
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.26495726495726496,
"acc_stderr": 0.02891120880274949,
"acc_norm": 0.26495726495726496,
"acc_norm_stderr": 0.02891120880274949
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23371647509578544,
"acc_stderr": 0.015133383278988844,
"acc_norm": 0.23371647509578544,
"acc_norm_stderr": 0.015133383278988844
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.22832369942196531,
"acc_stderr": 0.022598703804321628,
"acc_norm": 0.22832369942196531,
"acc_norm_stderr": 0.022598703804321628
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24916201117318434,
"acc_stderr": 0.014465893829859924,
"acc_norm": 0.24916201117318434,
"acc_norm_stderr": 0.014465893829859924
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.19935691318327975,
"acc_stderr": 0.022691033780549656,
"acc_norm": 0.19935691318327975,
"acc_norm_stderr": 0.022691033780549656
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.023132376234543343,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.023132376234543343
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.026011992930902,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.026011992930902
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.25358539765319427,
"acc_stderr": 0.011111715336101143,
"acc_norm": 0.25358539765319427,
"acc_norm_stderr": 0.011111715336101143
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2630718954248366,
"acc_stderr": 0.01781267654232065,
"acc_norm": 0.2630718954248366,
"acc_norm_stderr": 0.01781267654232065
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2,
"acc_stderr": 0.03831305140884603,
"acc_norm": 0.2,
"acc_norm_stderr": 0.03831305140884603
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3673469387755102,
"acc_stderr": 0.030862144921087558,
"acc_norm": 0.3673469387755102,
"acc_norm_stderr": 0.030862144921087558
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.208955223880597,
"acc_stderr": 0.028748298931728655,
"acc_norm": 0.208955223880597,
"acc_norm_stderr": 0.028748298931728655
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-virology|5": {
"acc": 0.20481927710843373,
"acc_stderr": 0.03141784291663926,
"acc_norm": 0.20481927710843373,
"acc_norm_stderr": 0.03141784291663926
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2046783625730994,
"acc_stderr": 0.030944459778533207,
"acc_norm": 0.2046783625730994,
"acc_norm_stderr": 0.030944459778533207
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2252141982864137,
"mc1_stderr": 0.014623240768023496,
"mc2": 0.41575126598869544,
"mc2_stderr": 0.015079894627974334
},
"harness|winogrande|5": {
"acc": 0.5201262825572218,
"acc_stderr": 0.01404109666434433
},
"harness|gsm8k|5": {
"acc": 0.01061410159211524,
"acc_stderr": 0.002822713322387704
}
}
```
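
If you prefer to read these numbers programmatically rather than copying them from the card, the raw results file linked above can be fetched directly. The sketch below uses `huggingface_hub`; the filename matches this run's timestamp, and the exact top-level layout of the JSON may include additional sections beyond the extract shown above:

```python
import json
from huggingface_hub import hf_hub_download

# Download the raw results file for this run from the dataset repository
# and inspect its contents; the metrics shown above come from this file.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_qblocks__gpt2_137m_DolphinCoder",
    filename="results_2024-01-05T07-48-29.644069.json",
    repo_type="dataset",
)
with open(path) as f:
    run = json.load(f)

print(run.keys())  # top-level sections of the results file
```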
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_qblocks__gpt2_137m_DolphinCoder | [
"region:us"
] | 2024-01-05T07:49:49+00:00 | {"pretty_name": "Evaluation run of qblocks/gpt2_137m_DolphinCoder", "dataset_summary": "Dataset automatically created during the evaluation run of model [qblocks/gpt2_137m_DolphinCoder](https://huggingface.co/qblocks/gpt2_137m_DolphinCoder) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_qblocks__gpt2_137m_DolphinCoder\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-05T07:48:29.644069](https://huggingface.co/datasets/open-llm-leaderboard/details_qblocks__gpt2_137m_DolphinCoder/blob/main/results_2024-01-05T07-48-29.644069.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2541058154915133,\n \"acc_stderr\": 0.030552087768393632,\n \"acc_norm\": 0.2544491269494238,\n \"acc_norm_stderr\": 0.03131084218129606,\n \"mc1\": 0.2252141982864137,\n \"mc1_stderr\": 0.014623240768023496,\n \"mc2\": 0.41575126598869544,\n \"mc2_stderr\": 0.015079894627974334\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.19795221843003413,\n \"acc_stderr\": 0.011643990971573395,\n \"acc_norm\": 0.21843003412969283,\n \"acc_norm_stderr\": 0.012074291605700983\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.29117705636327423,\n \"acc_stderr\": 0.00453376468621199,\n \"acc_norm\": 0.3134833698466441,\n \"acc_norm_stderr\": 0.004629608863272312\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2814814814814815,\n \"acc_stderr\": 0.038850042458002554,\n \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.038850042458002554\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.22641509433962265,\n \"acc_stderr\": 0.025757559893106737,\n \"acc_norm\": 0.22641509433962265,\n \"acc_norm_stderr\": 0.025757559893106737\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2152777777777778,\n \"acc_stderr\": 0.034370793441061344,\n \"acc_norm\": 0.2152777777777778,\n \"acc_norm_stderr\": 0.034370793441061344\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.03295304696818318,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.03295304696818318\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036622,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036622\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.03999423879281336,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.03999423879281336\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.038552896163789485,\n \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.038552896163789485\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.022019080012217893,\n \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.022019080012217893\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1349206349206349,\n \"acc_stderr\": 0.030557101589417508,\n \"acc_norm\": 0.1349206349206349,\n \"acc_norm_stderr\": 0.030557101589417508\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2129032258064516,\n \"acc_stderr\": 0.023287665127268552,\n \"acc_norm\": 0.2129032258064516,\n \"acc_norm_stderr\": 0.023287665127268552\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.21674876847290642,\n \"acc_stderr\": 0.028990331252516235,\n \"acc_norm\": 0.21674876847290642,\n \"acc_norm_stderr\": 0.028990331252516235\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21212121212121213,\n \"acc_stderr\": 0.03192271569548299,\n \"acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.03192271569548299\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.35353535353535354,\n \"acc_stderr\": 0.03406086723547153,\n \"acc_norm\": 0.35353535353535354,\n \"acc_norm_stderr\": 0.03406086723547153\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.36787564766839376,\n \"acc_stderr\": 0.03480175668466036,\n \"acc_norm\": 
0.36787564766839376,\n \"acc_norm_stderr\": 0.03480175668466036\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2230769230769231,\n \"acc_stderr\": 0.02110773012724399,\n \"acc_norm\": 0.2230769230769231,\n \"acc_norm_stderr\": 0.02110773012724399\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2518518518518518,\n \"acc_stderr\": 0.026466117538959916,\n \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.026466117538959916\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.24369747899159663,\n \"acc_stderr\": 0.02788682807838056,\n \"acc_norm\": 0.24369747899159663,\n \"acc_norm_stderr\": 0.02788682807838056\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2119205298013245,\n \"acc_stderr\": 0.033367670865679766,\n \"acc_norm\": 0.2119205298013245,\n \"acc_norm_stderr\": 0.033367670865679766\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3467889908256881,\n \"acc_stderr\": 0.020406097104093027,\n \"acc_norm\": 0.3467889908256881,\n \"acc_norm_stderr\": 0.020406097104093027\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.24019607843137256,\n \"acc_stderr\": 0.02998373305591361,\n \"acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.02998373305591361\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2869198312236287,\n \"acc_stderr\": 0.029443773022594693,\n \"acc_norm\": 0.2869198312236287,\n \"acc_norm_stderr\": 0.029443773022594693\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.25112107623318386,\n \"acc_stderr\": 0.029105220833224633,\n \"acc_norm\": 0.25112107623318386,\n \"acc_norm_stderr\": 0.029105220833224633\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.23140495867768596,\n \"acc_stderr\": 0.03849856098794089,\n \"acc_norm\": 0.23140495867768596,\n \"acc_norm_stderr\": 0.03849856098794089\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.3128834355828221,\n \"acc_stderr\": 0.036429145782924055,\n \"acc_norm\": 0.3128834355828221,\n \"acc_norm_stderr\": 0.036429145782924055\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.17857142857142858,\n \"acc_stderr\": 0.03635209121577806,\n \"acc_norm\": 0.17857142857142858,\n \"acc_norm_stderr\": 0.03635209121577806\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.3592233009708738,\n \"acc_stderr\": 0.04750458399041692,\n \"acc_norm\": 0.3592233009708738,\n \"acc_norm_stderr\": 0.04750458399041692\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.26495726495726496,\n \"acc_stderr\": 0.02891120880274949,\n \"acc_norm\": 0.26495726495726496,\n \"acc_norm_stderr\": 0.02891120880274949\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 
0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23371647509578544,\n \"acc_stderr\": 0.015133383278988844,\n \"acc_norm\": 0.23371647509578544,\n \"acc_norm_stderr\": 0.015133383278988844\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.22832369942196531,\n \"acc_stderr\": 0.022598703804321628,\n \"acc_norm\": 0.22832369942196531,\n \"acc_norm_stderr\": 0.022598703804321628\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24916201117318434,\n \"acc_stderr\": 0.014465893829859924,\n \"acc_norm\": 0.24916201117318434,\n \"acc_norm_stderr\": 0.014465893829859924\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.24836601307189543,\n \"acc_stderr\": 0.02473998135511359,\n \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.02473998135511359\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.19935691318327975,\n \"acc_stderr\": 0.022691033780549656,\n \"acc_norm\": 0.19935691318327975,\n \"acc_norm_stderr\": 0.022691033780549656\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.023132376234543343,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.023132376234543343\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2553191489361702,\n \"acc_stderr\": 0.026011992930902,\n \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.026011992930902\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.25358539765319427,\n \"acc_stderr\": 0.011111715336101143,\n \"acc_norm\": 0.25358539765319427,\n \"acc_norm_stderr\": 0.011111715336101143\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2630718954248366,\n \"acc_stderr\": 0.01781267654232065,\n \"acc_norm\": 0.2630718954248366,\n \"acc_norm_stderr\": 0.01781267654232065\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.03831305140884603,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.03831305140884603\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.3673469387755102,\n \"acc_stderr\": 0.030862144921087558,\n \"acc_norm\": 0.3673469387755102,\n \"acc_norm_stderr\": 0.030862144921087558\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.208955223880597,\n \"acc_stderr\": 0.028748298931728655,\n \"acc_norm\": 0.208955223880597,\n \"acc_norm_stderr\": 0.028748298931728655\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.20481927710843373,\n \"acc_stderr\": 0.03141784291663926,\n \"acc_norm\": 0.20481927710843373,\n \"acc_norm_stderr\": 0.03141784291663926\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2046783625730994,\n \"acc_stderr\": 0.030944459778533207,\n \"acc_norm\": 0.2046783625730994,\n \"acc_norm_stderr\": 0.030944459778533207\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2252141982864137,\n \"mc1_stderr\": 0.014623240768023496,\n \"mc2\": 0.41575126598869544,\n \"mc2_stderr\": 0.015079894627974334\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5201262825572218,\n \"acc_stderr\": 0.01404109666434433\n },\n \"harness|gsm8k|5\": {\n 
\"acc\": 0.01061410159211524,\n \"acc_stderr\": 0.002822713322387704\n }\n}\n```", "repo_url": "https://huggingface.co/qblocks/gpt2_137m_DolphinCoder", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|arc:challenge|25_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|gsm8k|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hellaswag|10_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T07-48-29.644069.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T07-48-29.644069.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T07-48-29.644069.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T07-48-29.644069.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T07-48-29.644069.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["**/details_harness|winogrande|5_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-05T07-48-29.644069.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_05T07_48_29.644069", "path": ["results_2024-01-05T07-48-29.644069.parquet"]}, {"split": "latest", "path": ["results_2024-01-05T07-48-29.644069.parquet"]}]}]} | 2024-01-05T07:50:12+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of qblocks/gpt2_137m_DolphinCoder
Dataset automatically created during the evaluation run of model qblocks/gpt2_137m_DolphinCoder on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
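A minimal sketch of that call (the repository id below is assumed from the usual `open-llm-leaderboard/details_<org>__<model>` naming pattern, and `harness_winogrande_5` is one of the configurations listed in this card's metadata):

```python
from datasets import load_dataset

# Load one task configuration; the "train" split points to the latest results.
data = load_dataset("open-llm-leaderboard/details_qblocks__gpt2_137m_DolphinCoder",
                    "harness_winogrande_5",
                    split="train")
```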
## Latest results
These are the latest results from run 2024-01-05T07:48:29.644069 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of qblocks/gpt2_137m_DolphinCoder\n\n\n\nDataset automatically created during the evaluation run of model qblocks/gpt2_137m_DolphinCoder on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T07:48:29.644069(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of qblocks/gpt2_137m_DolphinCoder\n\n\n\nDataset automatically created during the evaluation run of model qblocks/gpt2_137m_DolphinCoder on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T07:48:29.644069(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
193,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of qblocks/gpt2_137m_DolphinCoder\n\n\n\nDataset automatically created during the evaluation run of model qblocks/gpt2_137m_DolphinCoder on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-05T07:48:29.644069(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
8ac0206c9d49f2d19ea69c87bea483ed975b86c0 |
# Dataset Card for Evaluation run of qblocks/codellama_7b_DolphinCoder
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [qblocks/codellama_7b_DolphinCoder](https://huggingface.co/qblocks/codellama_7b_DolphinCoder) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
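To see which configurations and splits are actually exposed before loading anything, here is a small sketch using the standard `datasets` inspection helpers (nothing in it is specific to this card beyond the repository id):

```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo_id = "open-llm-leaderboard/details_qblocks__codellama_7b_DolphinCoder"

# One configuration per evaluated task, plus the aggregated "results" configuration.
configs = get_dataset_config_names(repo_id)
print(len(configs), configs[:5])

# Per this card's metadata, each configuration has a split named after the run
# timestamp and a "latest" split pointing at the most recent run.
print(get_dataset_split_names(repo_id, "harness_winogrande_5"))
```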
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_qblocks__codellama_7b_DolphinCoder",
"harness_winogrande_5",
split="train")
```
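Two common variations on the same call, sketched from the splits and configurations declared in this card's metadata: pinning the most recent run explicitly via the "latest" split, and pulling the aggregated metrics from the "results" configuration.

```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_qblocks__codellama_7b_DolphinCoder"

# Same task, but addressed through the "latest" split alias instead of "train".
winogrande_latest = load_dataset(repo, "harness_winogrande_5", split="latest")

# Aggregated metrics for the whole run live in the "results" configuration.
aggregated = load_dataset(repo, "results", split="latest")
print(aggregated)
```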
## Latest results
These are the [latest results from run 2024-01-05T08:13:34.391359](https://huggingface.co/datasets/open-llm-leaderboard/details_qblocks__codellama_7b_DolphinCoder/blob/main/results_2024-01-05T08-13-34.391359.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3827239446273333,
"acc_stderr": 0.034226432114737984,
"acc_norm": 0.3863708183260275,
"acc_norm_stderr": 0.03501715050425477,
"mc1": 0.22031823745410037,
"mc1_stderr": 0.014509045171487291,
"mc2": 0.35450592505891126,
"mc2_stderr": 0.014292262562897113
},
"harness|arc:challenge|25": {
"acc": 0.39761092150170646,
"acc_stderr": 0.014301752223279536,
"acc_norm": 0.4197952218430034,
"acc_norm_stderr": 0.014422181226303026
},
"harness|hellaswag|10": {
"acc": 0.49432383987253536,
"acc_stderr": 0.004989459871609184,
"acc_norm": 0.6550487950607449,
"acc_norm_stderr": 0.004743808792037848
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.34868421052631576,
"acc_stderr": 0.038781398887976104,
"acc_norm": 0.34868421052631576,
"acc_norm_stderr": 0.038781398887976104
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.33962264150943394,
"acc_stderr": 0.02914690474779833,
"acc_norm": 0.33962264150943394,
"acc_norm_stderr": 0.02914690474779833
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.039420826399272135,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.039420826399272135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3236994219653179,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.3236994219653179,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364397,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364397
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2978723404255319,
"acc_stderr": 0.029896145682095455,
"acc_norm": 0.2978723404255319,
"acc_norm_stderr": 0.029896145682095455
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25132275132275134,
"acc_stderr": 0.022340482339643898,
"acc_norm": 0.25132275132275134,
"acc_norm_stderr": 0.022340482339643898
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235172,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.36774193548387096,
"acc_stderr": 0.02743086657997347,
"acc_norm": 0.36774193548387096,
"acc_norm_stderr": 0.02743086657997347
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.24630541871921183,
"acc_stderr": 0.030315099285617715,
"acc_norm": 0.24630541871921183,
"acc_norm_stderr": 0.030315099285617715
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4121212121212121,
"acc_stderr": 0.03843566993588717,
"acc_norm": 0.4121212121212121,
"acc_norm_stderr": 0.03843566993588717
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.46464646464646464,
"acc_stderr": 0.03553436368828063,
"acc_norm": 0.46464646464646464,
"acc_norm_stderr": 0.03553436368828063
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.45595854922279794,
"acc_stderr": 0.035944137112724366,
"acc_norm": 0.45595854922279794,
"acc_norm_stderr": 0.035944137112724366
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.36153846153846153,
"acc_stderr": 0.024359581465396987,
"acc_norm": 0.36153846153846153,
"acc_norm_stderr": 0.024359581465396987
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228402,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228402
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3739495798319328,
"acc_stderr": 0.031429466378837076,
"acc_norm": 0.3739495798319328,
"acc_norm_stderr": 0.031429466378837076
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.03879687024073328,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.03879687024073328
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.48073394495412847,
"acc_stderr": 0.021421402982548878,
"acc_norm": 0.48073394495412847,
"acc_norm_stderr": 0.021421402982548878
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.034542365853806094,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.034542365853806094
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4810126582278481,
"acc_stderr": 0.03252375148090448,
"acc_norm": 0.4810126582278481,
"acc_norm_stderr": 0.03252375148090448
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4080717488789238,
"acc_stderr": 0.03298574607842821,
"acc_norm": 0.4080717488789238,
"acc_norm_stderr": 0.03298574607842821
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.33587786259541985,
"acc_stderr": 0.04142313771996665,
"acc_norm": 0.33587786259541985,
"acc_norm_stderr": 0.04142313771996665
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4380165289256198,
"acc_stderr": 0.045291468044357915,
"acc_norm": 0.4380165289256198,
"acc_norm_stderr": 0.045291468044357915
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.04766075165356461,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.04766075165356461
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3987730061349693,
"acc_stderr": 0.03847021420456026,
"acc_norm": 0.3987730061349693,
"acc_norm_stderr": 0.03847021420456026
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.04327040932578728,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.04327040932578728
},
"harness|hendrycksTest-management|5": {
"acc": 0.47572815533980584,
"acc_stderr": 0.049449010929737795,
"acc_norm": 0.47572815533980584,
"acc_norm_stderr": 0.049449010929737795
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5726495726495726,
"acc_stderr": 0.032408473935163266,
"acc_norm": 0.5726495726495726,
"acc_norm_stderr": 0.032408473935163266
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.017784034534992433,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.017784034534992433
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3988439306358382,
"acc_stderr": 0.026362437574546545,
"acc_norm": 0.3988439306358382,
"acc_norm_stderr": 0.026362437574546545
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331154,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331154
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.02845263998508801,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.02845263998508801
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4180064308681672,
"acc_stderr": 0.028013651891995072,
"acc_norm": 0.4180064308681672,
"acc_norm_stderr": 0.028013651891995072
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.027586006221607715,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.027586006221607715
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.30141843971631205,
"acc_stderr": 0.027374128882631146,
"acc_norm": 0.30141843971631205,
"acc_norm_stderr": 0.027374128882631146
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.303129074315515,
"acc_stderr": 0.011738669951254293,
"acc_norm": 0.303129074315515,
"acc_norm_stderr": 0.011738669951254293
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.35661764705882354,
"acc_stderr": 0.029097209568411955,
"acc_norm": 0.35661764705882354,
"acc_norm_stderr": 0.029097209568411955
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.01933314202079706,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.01933314202079706
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4727272727272727,
"acc_stderr": 0.04782001791380063,
"acc_norm": 0.4727272727272727,
"acc_norm_stderr": 0.04782001791380063
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.40816326530612246,
"acc_stderr": 0.03146465712827424,
"acc_norm": 0.40816326530612246,
"acc_norm_stderr": 0.03146465712827424
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5124378109452736,
"acc_stderr": 0.03534439848539579,
"acc_norm": 0.5124378109452736,
"acc_norm_stderr": 0.03534439848539579
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-virology|5": {
"acc": 0.35542168674698793,
"acc_stderr": 0.03726214354322415,
"acc_norm": 0.35542168674698793,
"acc_norm_stderr": 0.03726214354322415
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.47953216374269003,
"acc_stderr": 0.038316105328219316,
"acc_norm": 0.47953216374269003,
"acc_norm_stderr": 0.038316105328219316
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22031823745410037,
"mc1_stderr": 0.014509045171487291,
"mc2": 0.35450592505891126,
"mc2_stderr": 0.014292262562897113
},
"harness|winogrande|5": {
"acc": 0.6361483820047356,
"acc_stderr": 0.013521488896883408
},
"harness|gsm8k|5": {
"acc": 0.09704321455648218,
"acc_stderr": 0.008153768274554735
}
}
```
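The block above is an excerpt of the linked results file. To work with the raw JSON directly, here is a minimal sketch; the filename comes from the link above, and since the exact top-level layout of the file is an assumption, the snippet handles both the case where the metrics are the whole file and the case where they sit under a top-level "results" key.

```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results file referenced in the link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_qblocks__codellama_7b_DolphinCoder",
    filename="results_2024-01-05T08-13-34.391359.json",
    repo_type="dataset",
)

with open(path) as f:
    payload = json.load(f)

# Metrics may be the whole payload or nested under a "results" key (assumption).
metrics = payload.get("results", payload)
print(metrics["all"]["acc"], metrics["harness|gsm8k|5"]["acc"])
```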
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_qblocks__codellama_7b_DolphinCoder | [
"region:us"
] | 2024-01-05T08:15:58+00:00 | {"pretty_name": "Evaluation run of qblocks/codellama_7b_DolphinCoder", "dataset_summary": "Dataset automatically created during the evaluation run of model [qblocks/codellama_7b_DolphinCoder](https://huggingface.co/qblocks/codellama_7b_DolphinCoder) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_qblocks__codellama_7b_DolphinCoder\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-05T08:13:34.391359](https://huggingface.co/datasets/open-llm-leaderboard/details_qblocks__codellama_7b_DolphinCoder/blob/main/results_2024-01-05T08-13-34.391359.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3827239446273333,\n \"acc_stderr\": 0.034226432114737984,\n \"acc_norm\": 0.3863708183260275,\n \"acc_norm_stderr\": 0.03501715050425477,\n \"mc1\": 0.22031823745410037,\n \"mc1_stderr\": 0.014509045171487291,\n \"mc2\": 0.35450592505891126,\n \"mc2_stderr\": 0.014292262562897113\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.39761092150170646,\n \"acc_stderr\": 0.014301752223279536,\n \"acc_norm\": 0.4197952218430034,\n \"acc_norm_stderr\": 0.014422181226303026\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.49432383987253536,\n \"acc_stderr\": 0.004989459871609184,\n \"acc_norm\": 0.6550487950607449,\n \"acc_norm_stderr\": 0.004743808792037848\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.37777777777777777,\n \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.37777777777777777,\n \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.34868421052631576,\n \"acc_stderr\": 0.038781398887976104,\n \"acc_norm\": 0.34868421052631576,\n \"acc_norm_stderr\": 0.038781398887976104\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.33962264150943394,\n \"acc_stderr\": 0.02914690474779833,\n \"acc_norm\": 0.33962264150943394,\n \"acc_norm_stderr\": 0.02914690474779833\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.039420826399272135,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.039420826399272135\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3236994219653179,\n \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.3236994219653179,\n \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364397,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364397\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2978723404255319,\n \"acc_stderr\": 0.029896145682095455,\n \"acc_norm\": 0.2978723404255319,\n \"acc_norm_stderr\": 0.029896145682095455\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25132275132275134,\n \"acc_stderr\": 0.022340482339643898,\n \"acc_norm\": 0.25132275132275134,\n \"acc_norm_stderr\": 0.022340482339643898\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n \"acc_stderr\": 0.03970158273235172,\n \"acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.03970158273235172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.36774193548387096,\n \"acc_stderr\": 0.02743086657997347,\n \"acc_norm\": 0.36774193548387096,\n \"acc_norm_stderr\": 0.02743086657997347\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.24630541871921183,\n \"acc_stderr\": 0.030315099285617715,\n \"acc_norm\": 0.24630541871921183,\n \"acc_norm_stderr\": 0.030315099285617715\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.4121212121212121,\n \"acc_stderr\": 0.03843566993588717,\n \"acc_norm\": 0.4121212121212121,\n \"acc_norm_stderr\": 0.03843566993588717\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.46464646464646464,\n \"acc_stderr\": 0.03553436368828063,\n \"acc_norm\": 0.46464646464646464,\n \"acc_norm_stderr\": 0.03553436368828063\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.45595854922279794,\n \"acc_stderr\": 0.035944137112724366,\n \"acc_norm\": 0.45595854922279794,\n 
\"acc_norm_stderr\": 0.035944137112724366\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.36153846153846153,\n \"acc_stderr\": 0.024359581465396987,\n \"acc_norm\": 0.36153846153846153,\n \"acc_norm_stderr\": 0.024359581465396987\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.027940457136228402,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.027940457136228402\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3739495798319328,\n \"acc_stderr\": 0.031429466378837076,\n \"acc_norm\": 0.3739495798319328,\n \"acc_norm_stderr\": 0.031429466378837076\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073328,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073328\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.48073394495412847,\n \"acc_stderr\": 0.021421402982548878,\n \"acc_norm\": 0.48073394495412847,\n \"acc_norm_stderr\": 0.021421402982548878\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538272,\n \"acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538272\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.034542365853806094,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.034542365853806094\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.4810126582278481,\n \"acc_stderr\": 0.03252375148090448,\n \"acc_norm\": 0.4810126582278481,\n \"acc_norm_stderr\": 0.03252375148090448\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4080717488789238,\n \"acc_stderr\": 0.03298574607842821,\n \"acc_norm\": 0.4080717488789238,\n \"acc_norm_stderr\": 0.03298574607842821\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.33587786259541985,\n \"acc_stderr\": 0.04142313771996665,\n \"acc_norm\": 0.33587786259541985,\n \"acc_norm_stderr\": 0.04142313771996665\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.4380165289256198,\n \"acc_stderr\": 0.045291468044357915,\n \"acc_norm\": 0.4380165289256198,\n \"acc_norm_stderr\": 0.045291468044357915\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4166666666666667,\n \"acc_stderr\": 0.04766075165356461,\n \"acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.04766075165356461\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.3987730061349693,\n \"acc_stderr\": 0.03847021420456026,\n \"acc_norm\": 0.3987730061349693,\n \"acc_norm_stderr\": 0.03847021420456026\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n \"acc_stderr\": 0.04327040932578728,\n \"acc_norm\": 0.29464285714285715,\n \"acc_norm_stderr\": 0.04327040932578728\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.47572815533980584,\n \"acc_stderr\": 0.049449010929737795,\n \"acc_norm\": 0.47572815533980584,\n \"acc_norm_stderr\": 0.049449010929737795\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5726495726495726,\n \"acc_stderr\": 0.032408473935163266,\n \"acc_norm\": 0.5726495726495726,\n \"acc_norm_stderr\": 0.032408473935163266\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.017784034534992433,\n \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.017784034534992433\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.3988439306358382,\n \"acc_stderr\": 0.026362437574546545,\n \"acc_norm\": 0.3988439306358382,\n \"acc_norm_stderr\": 0.026362437574546545\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n \"acc_stderr\": 0.014265554192331154,\n \"acc_norm\": 0.23910614525139665,\n \"acc_norm_stderr\": 0.014265554192331154\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.02845263998508801,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.02845263998508801\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4180064308681672,\n \"acc_stderr\": 0.028013651891995072,\n \"acc_norm\": 0.4180064308681672,\n \"acc_norm_stderr\": 0.028013651891995072\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.4351851851851852,\n \"acc_stderr\": 0.027586006221607715,\n \"acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.027586006221607715\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.30141843971631205,\n \"acc_stderr\": 0.027374128882631146,\n \"acc_norm\": 0.30141843971631205,\n \"acc_norm_stderr\": 0.027374128882631146\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.303129074315515,\n \"acc_stderr\": 0.011738669951254293,\n \"acc_norm\": 0.303129074315515,\n \"acc_norm_stderr\": 0.011738669951254293\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.35661764705882354,\n \"acc_stderr\": 0.029097209568411955,\n \"acc_norm\": 0.35661764705882354,\n \"acc_norm_stderr\": 0.029097209568411955\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.01933314202079706,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.01933314202079706\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4727272727272727,\n \"acc_stderr\": 0.04782001791380063,\n \"acc_norm\": 0.4727272727272727,\n \"acc_norm_stderr\": 0.04782001791380063\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.40816326530612246,\n \"acc_stderr\": 0.03146465712827424,\n \"acc_norm\": 0.40816326530612246,\n \"acc_norm_stderr\": 0.03146465712827424\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5124378109452736,\n \"acc_stderr\": 0.03534439848539579,\n \"acc_norm\": 0.5124378109452736,\n \"acc_norm_stderr\": 0.03534439848539579\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.35542168674698793,\n \"acc_stderr\": 0.03726214354322415,\n \"acc_norm\": 0.35542168674698793,\n \"acc_norm_stderr\": 0.03726214354322415\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.47953216374269003,\n \"acc_stderr\": 0.038316105328219316,\n \"acc_norm\": 0.47953216374269003,\n \"acc_norm_stderr\": 0.038316105328219316\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22031823745410037,\n \"mc1_stderr\": 0.014509045171487291,\n \"mc2\": 0.35450592505891126,\n \"mc2_stderr\": 0.014292262562897113\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6361483820047356,\n \"acc_stderr\": 0.013521488896883408\n },\n \"harness|gsm8k|5\": 
{\n \"acc\": 0.09704321455648218,\n \"acc_stderr\": 0.008153768274554735\n }\n}\n```", "repo_url": "https://huggingface.co/qblocks/codellama_7b_DolphinCoder", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|arc:challenge|25_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|gsm8k|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hellaswag|10_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T08-13-34.391359.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T08-13-34.391359.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T08-13-34.391359.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T08-13-34.391359.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T08-13-34.391359.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["**/details_harness|winogrande|5_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-05T08-13-34.391359.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_05T08_13_34.391359", "path": ["results_2024-01-05T08-13-34.391359.parquet"]}, {"split": "latest", "path": ["results_2024-01-05T08-13-34.391359.parquet"]}]}]} | 2024-01-05T08:16:22+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of qblocks/codellama_7b_DolphinCoder
Dataset automatically created during the evaluation run of model qblocks/codellama_7b_DolphinCoder on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
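A minimal sketch, mirroring the loader snippet used by the sibling cards in this collection (the repository id below follows the leaderboard's `details_<org>__<model>` naming pattern, and `harness_winogrande_5` is one of the task configurations listed in this card's metadata; treat both names as assumptions if your copy differs):

```python
from datasets import load_dataset

# Repository id assumed from the leaderboard naming convention for this model;
# "harness_winogrande_5" is one of the 63 per-task configurations of this card.
data = load_dataset("open-llm-leaderboard/details_qblocks__codellama_7b_DolphinCoder",
                    "harness_winogrande_5",
                    split="train")
print(data)
```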
## Latest results
These are the latest results from run 2024-01-05T08:13:34.391359 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of qblocks/codellama_7b_DolphinCoder\n\n\n\nDataset automatically created during the evaluation run of model qblocks/codellama_7b_DolphinCoder on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T08:13:34.391359(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of qblocks/codellama_7b_DolphinCoder\n\n\n\nDataset automatically created during the evaluation run of model qblocks/codellama_7b_DolphinCoder on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T08:13:34.391359(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
193,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of qblocks/codellama_7b_DolphinCoder\n\n\n\nDataset automatically created during the evaluation run of model qblocks/codellama_7b_DolphinCoder on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-05T08:13:34.391359(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
155990dcca4689752f17726b257fb9c695ed94d1 |
# Dataset Card for Evaluation run of qblocks/falcon_7b_DolphinCoder
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [qblocks/falcon_7b_DolphinCoder](https://huggingface.co/qblocks/falcon_7b_DolphinCoder) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_qblocks__falcon_7b_DolphinCoder",
"harness_winogrande_5",
split="train")
```
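The snippet above pulls per-sample details for a single task configuration. To get the aggregated metrics instead, the separate "results" configuration described earlier can be loaded the same way; a minimal sketch, assuming the aggregated parquet is exposed as a regular dataset split (the config name "results" and the "latest" split come from this card's own configuration list):

```python
from datasets import load_dataset

# Aggregated run-level metrics rather than per-sample task details.
# "results" / "latest" follow the configuration names documented in this card.
results = load_dataset("open-llm-leaderboard/details_qblocks__falcon_7b_DolphinCoder",
                       "results",
                       split="latest")
print(results)
```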
## Latest results
These are the [latest results from run 2024-01-05T08:20:23.826354](https://huggingface.co/datasets/open-llm-leaderboard/details_qblocks__falcon_7b_DolphinCoder/blob/main/results_2024-01-05T08-20-23.826354.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.28247468876632786,
"acc_stderr": 0.03156556817285131,
"acc_norm": 0.28312300073590296,
"acc_norm_stderr": 0.032303512019122835,
"mc1": 0.2252141982864137,
"mc1_stderr": 0.014623240768023493,
"mc2": 0.35117580451709535,
"mc2_stderr": 0.013551047154306205
},
"harness|arc:challenge|25": {
"acc": 0.45307167235494883,
"acc_stderr": 0.014546892052005631,
"acc_norm": 0.4872013651877133,
"acc_norm_stderr": 0.014606603181012538
},
"harness|hellaswag|10": {
"acc": 0.5855407289384584,
"acc_stderr": 0.004916216503770337,
"acc_norm": 0.7803226448914559,
"acc_norm_stderr": 0.004131818797713878
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03820169914517905,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03820169914517905
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2565789473684211,
"acc_stderr": 0.0355418036802569,
"acc_norm": 0.2565789473684211,
"acc_norm_stderr": 0.0355418036802569
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2830188679245283,
"acc_stderr": 0.0277242364927009,
"acc_norm": 0.2830188679245283,
"acc_norm_stderr": 0.0277242364927009
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2013888888888889,
"acc_stderr": 0.033536474697138406,
"acc_norm": 0.2013888888888889,
"acc_norm_stderr": 0.033536474697138406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.03186209851641144,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.03186209851641144
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.28936170212765955,
"acc_stderr": 0.02964400657700962,
"acc_norm": 0.28936170212765955,
"acc_norm_stderr": 0.02964400657700962
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.03855289616378949,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.03855289616378949
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.22486772486772486,
"acc_stderr": 0.021502096078229147,
"acc_norm": 0.22486772486772486,
"acc_norm_stderr": 0.021502096078229147
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.03852273364924317,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.03852273364924317
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24838709677419354,
"acc_stderr": 0.024580028921481003,
"acc_norm": 0.24838709677419354,
"acc_norm_stderr": 0.024580028921481003
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.29064039408866993,
"acc_stderr": 0.0319474007226554,
"acc_norm": 0.29064039408866993,
"acc_norm_stderr": 0.0319474007226554
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.19696969696969696,
"acc_stderr": 0.028335609732463345,
"acc_norm": 0.19696969696969696,
"acc_norm_stderr": 0.028335609732463345
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21761658031088082,
"acc_stderr": 0.029778663037752937,
"acc_norm": 0.21761658031088082,
"acc_norm_stderr": 0.029778663037752937
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24871794871794872,
"acc_stderr": 0.0219169577092138,
"acc_norm": 0.24871794871794872,
"acc_norm_stderr": 0.0219169577092138
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712177,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712177
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2605042016806723,
"acc_stderr": 0.028510251512341933,
"acc_norm": 0.2605042016806723,
"acc_norm_stderr": 0.028510251512341933
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24036697247706423,
"acc_stderr": 0.01832060732096407,
"acc_norm": 0.24036697247706423,
"acc_norm_stderr": 0.01832060732096407
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.19444444444444445,
"acc_stderr": 0.026991454502036733,
"acc_norm": 0.19444444444444445,
"acc_norm_stderr": 0.026991454502036733
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.29901960784313725,
"acc_stderr": 0.032133257173736156,
"acc_norm": 0.29901960784313725,
"acc_norm_stderr": 0.032133257173736156
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.36771300448430494,
"acc_stderr": 0.03236198350928276,
"acc_norm": 0.36771300448430494,
"acc_norm_stderr": 0.03236198350928276
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.32061068702290074,
"acc_stderr": 0.040933292298342784,
"acc_norm": 0.32061068702290074,
"acc_norm_stderr": 0.040933292298342784
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.32407407407407407,
"acc_stderr": 0.04524596007030048,
"acc_norm": 0.32407407407407407,
"acc_norm_stderr": 0.04524596007030048
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22699386503067484,
"acc_stderr": 0.032910995786157686,
"acc_norm": 0.22699386503067484,
"acc_norm_stderr": 0.032910995786157686
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.22330097087378642,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.22330097087378642,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3247863247863248,
"acc_stderr": 0.030679022765498835,
"acc_norm": 0.3247863247863248,
"acc_norm_stderr": 0.030679022765498835
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.01598281477469563,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.01598281477469563
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2947976878612717,
"acc_stderr": 0.02454761779480383,
"acc_norm": 0.2947976878612717,
"acc_norm_stderr": 0.02454761779480383
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.014355911964767857,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.014355911964767857
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2908496732026144,
"acc_stderr": 0.026004800363952113,
"acc_norm": 0.2908496732026144,
"acc_norm_stderr": 0.026004800363952113
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.22508038585209003,
"acc_stderr": 0.023720088516179027,
"acc_norm": 0.22508038585209003,
"acc_norm_stderr": 0.023720088516179027
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.30246913580246915,
"acc_stderr": 0.02555765398186805,
"acc_norm": 0.30246913580246915,
"acc_norm_stderr": 0.02555765398186805
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2765957446808511,
"acc_stderr": 0.026684564340461004,
"acc_norm": 0.2765957446808511,
"acc_norm_stderr": 0.026684564340461004
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2405475880052151,
"acc_stderr": 0.01091640673547895,
"acc_norm": 0.2405475880052151,
"acc_norm_stderr": 0.01091640673547895
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2757352941176471,
"acc_stderr": 0.02714627193662517,
"acc_norm": 0.2757352941176471,
"acc_norm_stderr": 0.02714627193662517
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2696078431372549,
"acc_stderr": 0.017952449196987862,
"acc_norm": 0.2696078431372549,
"acc_norm_stderr": 0.017952449196987862
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3090909090909091,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.3090909090909091,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.19591836734693877,
"acc_stderr": 0.025409301953225678,
"acc_norm": 0.19591836734693877,
"acc_norm_stderr": 0.025409301953225678
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.3283582089552239,
"acc_stderr": 0.033206858897443244,
"acc_norm": 0.3283582089552239,
"acc_norm_stderr": 0.033206858897443244
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3373493975903614,
"acc_stderr": 0.03680783690727581,
"acc_norm": 0.3373493975903614,
"acc_norm_stderr": 0.03680783690727581
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.34502923976608185,
"acc_stderr": 0.036459813773888065,
"acc_norm": 0.34502923976608185,
"acc_norm_stderr": 0.036459813773888065
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2252141982864137,
"mc1_stderr": 0.014623240768023493,
"mc2": 0.35117580451709535,
"mc2_stderr": 0.013551047154306205
},
"harness|winogrande|5": {
"acc": 0.7048145224940805,
"acc_stderr": 0.012819410741754765
},
"harness|gsm8k|5": {
"acc": 0.05079605761940864,
"acc_stderr": 0.006048352096878086
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_qblocks__falcon_7b_DolphinCoder | [
"region:us"
] | 2024-01-05T08:22:04+00:00 | {"pretty_name": "Evaluation run of qblocks/falcon_7b_DolphinCoder", "dataset_summary": "Dataset automatically created during the evaluation run of model [qblocks/falcon_7b_DolphinCoder](https://huggingface.co/qblocks/falcon_7b_DolphinCoder) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_qblocks__falcon_7b_DolphinCoder\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-05T08:20:23.826354](https://huggingface.co/datasets/open-llm-leaderboard/details_qblocks__falcon_7b_DolphinCoder/blob/main/results_2024-01-05T08-20-23.826354.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.28247468876632786,\n \"acc_stderr\": 0.03156556817285131,\n \"acc_norm\": 0.28312300073590296,\n \"acc_norm_stderr\": 0.032303512019122835,\n \"mc1\": 0.2252141982864137,\n \"mc1_stderr\": 0.014623240768023493,\n \"mc2\": 0.35117580451709535,\n \"mc2_stderr\": 0.013551047154306205\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.45307167235494883,\n \"acc_stderr\": 0.014546892052005631,\n \"acc_norm\": 0.4872013651877133,\n \"acc_norm_stderr\": 0.014606603181012538\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5855407289384584,\n \"acc_stderr\": 0.004916216503770337,\n \"acc_norm\": 0.7803226448914559,\n \"acc_norm_stderr\": 0.004131818797713878\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.03820169914517905,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.03820169914517905\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.2565789473684211,\n \"acc_stderr\": 0.0355418036802569,\n \"acc_norm\": 0.2565789473684211,\n \"acc_norm_stderr\": 0.0355418036802569\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2830188679245283,\n \"acc_stderr\": 0.0277242364927009,\n \"acc_norm\": 0.2830188679245283,\n \"acc_norm_stderr\": 0.0277242364927009\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2013888888888889,\n \"acc_stderr\": 0.033536474697138406,\n \"acc_norm\": 0.2013888888888889,\n \"acc_norm_stderr\": 0.033536474697138406\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2254335260115607,\n \"acc_stderr\": 0.03186209851641144,\n \"acc_norm\": 0.2254335260115607,\n \"acc_norm_stderr\": 0.03186209851641144\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.28936170212765955,\n \"acc_stderr\": 0.02964400657700962,\n \"acc_norm\": 0.28936170212765955,\n \"acc_norm_stderr\": 0.02964400657700962\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.03855289616378949,\n \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.03855289616378949\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.22486772486772486,\n \"acc_stderr\": 0.021502096078229147,\n \"acc_norm\": 0.22486772486772486,\n \"acc_norm_stderr\": 0.021502096078229147\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n \"acc_stderr\": 0.03852273364924317,\n \"acc_norm\": 0.24603174603174602,\n \"acc_norm_stderr\": 0.03852273364924317\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24838709677419354,\n \"acc_stderr\": 0.024580028921481003,\n \"acc_norm\": 0.24838709677419354,\n \"acc_norm_stderr\": 0.024580028921481003\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.29064039408866993,\n \"acc_stderr\": 0.0319474007226554,\n \"acc_norm\": 0.29064039408866993,\n \"acc_norm_stderr\": 0.0319474007226554\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.03453131801885415,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.03453131801885415\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.19696969696969696,\n \"acc_stderr\": 0.028335609732463345,\n \"acc_norm\": 0.19696969696969696,\n \"acc_norm_stderr\": 0.028335609732463345\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.21761658031088082,\n \"acc_stderr\": 0.029778663037752937,\n \"acc_norm\": 0.21761658031088082,\n 
\"acc_norm_stderr\": 0.029778663037752937\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.24871794871794872,\n \"acc_stderr\": 0.0219169577092138,\n \"acc_norm\": 0.24871794871794872,\n \"acc_norm_stderr\": 0.0219169577092138\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712177,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712177\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.2605042016806723,\n \"acc_stderr\": 0.028510251512341933,\n \"acc_norm\": 0.2605042016806723,\n \"acc_norm_stderr\": 0.028510251512341933\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.24036697247706423,\n \"acc_stderr\": 0.01832060732096407,\n \"acc_norm\": 0.24036697247706423,\n \"acc_norm_stderr\": 0.01832060732096407\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.19444444444444445,\n \"acc_stderr\": 0.026991454502036733,\n \"acc_norm\": 0.19444444444444445,\n \"acc_norm_stderr\": 0.026991454502036733\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.29901960784313725,\n \"acc_stderr\": 0.032133257173736156,\n \"acc_norm\": 0.29901960784313725,\n \"acc_norm_stderr\": 0.032133257173736156\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.36771300448430494,\n \"acc_stderr\": 0.03236198350928276,\n \"acc_norm\": 0.36771300448430494,\n \"acc_norm_stderr\": 0.03236198350928276\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.32061068702290074,\n \"acc_stderr\": 0.040933292298342784,\n \"acc_norm\": 0.32061068702290074,\n \"acc_norm_stderr\": 0.040933292298342784\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.32407407407407407,\n \"acc_stderr\": 0.04524596007030048,\n \"acc_norm\": 0.32407407407407407,\n \"acc_norm_stderr\": 0.04524596007030048\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.032910995786157686,\n \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.032910995786157686\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.22330097087378642,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.22330097087378642,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3247863247863248,\n \"acc_stderr\": 0.030679022765498835,\n \"acc_norm\": 0.3247863247863248,\n \"acc_norm_stderr\": 0.030679022765498835\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.01598281477469563,\n \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.01598281477469563\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2947976878612717,\n \"acc_stderr\": 0.02454761779480383,\n \"acc_norm\": 0.2947976878612717,\n \"acc_norm_stderr\": 0.02454761779480383\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n \"acc_stderr\": 0.014355911964767857,\n \"acc_norm\": 0.2435754189944134,\n \"acc_norm_stderr\": 0.014355911964767857\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2908496732026144,\n \"acc_stderr\": 0.026004800363952113,\n \"acc_norm\": 0.2908496732026144,\n \"acc_norm_stderr\": 0.026004800363952113\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.22508038585209003,\n \"acc_stderr\": 0.023720088516179027,\n \"acc_norm\": 0.22508038585209003,\n \"acc_norm_stderr\": 0.023720088516179027\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.30246913580246915,\n \"acc_stderr\": 0.02555765398186805,\n \"acc_norm\": 0.30246913580246915,\n \"acc_norm_stderr\": 0.02555765398186805\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2765957446808511,\n \"acc_stderr\": 0.026684564340461004,\n \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.026684564340461004\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2405475880052151,\n \"acc_stderr\": 0.01091640673547895,\n \"acc_norm\": 0.2405475880052151,\n \"acc_norm_stderr\": 0.01091640673547895\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.2757352941176471,\n \"acc_stderr\": 0.02714627193662517,\n \"acc_norm\": 0.2757352941176471,\n \"acc_norm_stderr\": 0.02714627193662517\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2696078431372549,\n \"acc_stderr\": 0.017952449196987862,\n \"acc_norm\": 0.2696078431372549,\n \"acc_norm_stderr\": 0.017952449196987862\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3090909090909091,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.3090909090909091,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.19591836734693877,\n \"acc_stderr\": 0.025409301953225678,\n \"acc_norm\": 0.19591836734693877,\n \"acc_norm_stderr\": 0.025409301953225678\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.3283582089552239,\n \"acc_stderr\": 0.033206858897443244,\n \"acc_norm\": 0.3283582089552239,\n \"acc_norm_stderr\": 0.033206858897443244\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3373493975903614,\n \"acc_stderr\": 0.03680783690727581,\n \"acc_norm\": 0.3373493975903614,\n \"acc_norm_stderr\": 0.03680783690727581\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.34502923976608185,\n \"acc_stderr\": 0.036459813773888065,\n \"acc_norm\": 0.34502923976608185,\n \"acc_norm_stderr\": 0.036459813773888065\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2252141982864137,\n \"mc1_stderr\": 0.014623240768023493,\n \"mc2\": 0.35117580451709535,\n \"mc2_stderr\": 0.013551047154306205\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7048145224940805,\n \"acc_stderr\": 0.012819410741754765\n },\n \"harness|gsm8k|5\": {\n 
\"acc\": 0.05079605761940864,\n \"acc_stderr\": 0.006048352096878086\n }\n}\n```", "repo_url": "https://huggingface.co/qblocks/falcon_7b_DolphinCoder", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|arc:challenge|25_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|gsm8k|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hellaswag|10_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T08-20-23.826354.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T08-20-23.826354.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T08-20-23.826354.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T08-20-23.826354.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T08-20-23.826354.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["**/details_harness|winogrande|5_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-05T08-20-23.826354.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_05T08_20_23.826354", "path": ["results_2024-01-05T08-20-23.826354.parquet"]}, {"split": "latest", "path": ["results_2024-01-05T08-20-23.826354.parquet"]}]}]} | 2024-01-05T08:22:27+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of qblocks/mistral_7b_DolphinCoder
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [qblocks/mistral_7b_DolphinCoder](https://huggingface.co/qblocks/mistral_7b_DolphinCoder) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_qblocks__mistral_7b_DolphinCoder",
"harness_winogrande_5",
split="train")
```
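If you want the aggregated metrics rather than per-task details, the same call works with the "results" configuration. The snippet below is a minimal sketch rather than part of the original card: `get_dataset_config_names` lists the available configurations, the "latest" split points to the most recent run, and the timestamped split name is an assumption inferred from the run date reported below.

```python
from datasets import get_dataset_config_names, load_dataset

# One configuration per evaluated task, plus the aggregated "results" configuration.
configs = get_dataset_config_names("open-llm-leaderboard/details_qblocks__mistral_7b_DolphinCoder")

# Aggregated metrics: "latest" always points to the most recent run, while
# timestamped splits (e.g. "2024_01_05T08_38_41.844099", assumed here from the
# run date) hold the results of individual runs.
results = load_dataset("open-llm-leaderboard/details_qblocks__mistral_7b_DolphinCoder",
	"results",
	split="latest")
```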
## Latest results
These are the [latest results from run 2024-01-05T08:38:41.844099](https://huggingface.co/datasets/open-llm-leaderboard/details_qblocks__mistral_7b_DolphinCoder/blob/main/results_2024-01-05T08-38-41.844099.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5955742366975546,
"acc_stderr": 0.032892026757812796,
"acc_norm": 0.6023520874451797,
"acc_norm_stderr": 0.03357558761791825,
"mc1": 0.2998776009791922,
"mc1_stderr": 0.01604035296671362,
"mc2": 0.43954153886534947,
"mc2_stderr": 0.014894783303440727
},
"harness|arc:challenge|25": {
"acc": 0.568259385665529,
"acc_stderr": 0.014474591427196204,
"acc_norm": 0.5972696245733788,
"acc_norm_stderr": 0.014332236306790149
},
"harness|hellaswag|10": {
"acc": 0.628460466042621,
"acc_stderr": 0.004822286556305222,
"acc_norm": 0.8163712407886875,
"acc_norm_stderr": 0.003863898546941602
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6452830188679245,
"acc_stderr": 0.02944517532819959,
"acc_norm": 0.6452830188679245,
"acc_norm_stderr": 0.02944517532819959
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6736111111111112,
"acc_stderr": 0.03921067198982266,
"acc_norm": 0.6736111111111112,
"acc_norm_stderr": 0.03921067198982266
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.045766654032077615,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.045766654032077615
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5148936170212766,
"acc_stderr": 0.032671518489247764,
"acc_norm": 0.5148936170212766,
"acc_norm_stderr": 0.032671518489247764
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.0416656757710158,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.0416656757710158
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.02519710107424649,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.02519710107424649
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6903225806451613,
"acc_stderr": 0.026302774983517418,
"acc_norm": 0.6903225806451613,
"acc_norm_stderr": 0.026302774983517418
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.03499113137676744,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.03499113137676744
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.03588624800091706,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.03588624800091706
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.030532892233932026,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.030532892233932026
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015178,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5692307692307692,
"acc_stderr": 0.025106820660539753,
"acc_norm": 0.5692307692307692,
"acc_norm_stderr": 0.025106820660539753
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524586,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524586
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6050420168067226,
"acc_stderr": 0.03175367846096625,
"acc_norm": 0.6050420168067226,
"acc_norm_stderr": 0.03175367846096625
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8,
"acc_stderr": 0.017149858514250948,
"acc_norm": 0.8,
"acc_norm_stderr": 0.017149858514250948
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591362,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591362
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.043300437496507416,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.043300437496507416
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.033809398139433545,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.033809398139433545
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.025140935950335428,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.025140935950335428
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.014866821664709583,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.014866821664709583
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.024946792225272314,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.024946792225272314
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.01435591196476786,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.01435591196476786
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666787,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666787
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.02623696588115326,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.02623696588115326
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7191358024691358,
"acc_stderr": 0.02500646975579922,
"acc_norm": 0.7191358024691358,
"acc_norm_stderr": 0.02500646975579922
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4148936170212766,
"acc_stderr": 0.0293922365846125,
"acc_norm": 0.4148936170212766,
"acc_norm_stderr": 0.0293922365846125
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43415906127770537,
"acc_stderr": 0.012659033237067248,
"acc_norm": 0.43415906127770537,
"acc_norm_stderr": 0.012659033237067248
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6286764705882353,
"acc_stderr": 0.02934980313976587,
"acc_norm": 0.6286764705882353,
"acc_norm_stderr": 0.02934980313976587
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6143790849673203,
"acc_stderr": 0.019691459052354015,
"acc_norm": 0.6143790849673203,
"acc_norm_stderr": 0.019691459052354015
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6285714285714286,
"acc_stderr": 0.030932858792789855,
"acc_norm": 0.6285714285714286,
"acc_norm_stderr": 0.030932858792789855
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7960199004975125,
"acc_stderr": 0.02849317624532607,
"acc_norm": 0.7960199004975125,
"acc_norm_stderr": 0.02849317624532607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368032,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368032
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2998776009791922,
"mc1_stderr": 0.01604035296671362,
"mc2": 0.43954153886534947,
"mc2_stderr": 0.014894783303440727
},
"harness|winogrande|5": {
"acc": 0.7458563535911602,
"acc_stderr": 0.012236307219708267
},
"harness|gsm8k|5": {
"acc": 0.2623199393479909,
"acc_stderr": 0.012116912419925704
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_qblocks__mistral_7b_DolphinCoder | [
"region:us"
] | 2024-01-05T08:40:58+00:00 | {"pretty_name": "Evaluation run of qblocks/mistral_7b_DolphinCoder", "dataset_summary": "Dataset automatically created during the evaluation run of model [qblocks/mistral_7b_DolphinCoder](https://huggingface.co/qblocks/mistral_7b_DolphinCoder) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_qblocks__mistral_7b_DolphinCoder\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-05T08:38:41.844099](https://huggingface.co/datasets/open-llm-leaderboard/details_qblocks__mistral_7b_DolphinCoder/blob/main/results_2024-01-05T08-38-41.844099.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5955742366975546,\n \"acc_stderr\": 0.032892026757812796,\n \"acc_norm\": 0.6023520874451797,\n \"acc_norm_stderr\": 0.03357558761791825,\n \"mc1\": 0.2998776009791922,\n \"mc1_stderr\": 0.01604035296671362,\n \"mc2\": 0.43954153886534947,\n \"mc2_stderr\": 0.014894783303440727\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.568259385665529,\n \"acc_stderr\": 0.014474591427196204,\n \"acc_norm\": 0.5972696245733788,\n \"acc_norm_stderr\": 0.014332236306790149\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.628460466042621,\n \"acc_stderr\": 0.004822286556305222,\n \"acc_norm\": 0.8163712407886875,\n \"acc_norm_stderr\": 0.003863898546941602\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6452830188679245,\n \"acc_stderr\": 0.02944517532819959,\n \"acc_norm\": 0.6452830188679245,\n \"acc_norm_stderr\": 0.02944517532819959\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6736111111111112,\n \"acc_stderr\": 0.03921067198982266,\n \"acc_norm\": 0.6736111111111112,\n \"acc_norm_stderr\": 0.03921067198982266\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077615,\n \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077615\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5148936170212766,\n \"acc_stderr\": 0.032671518489247764,\n \"acc_norm\": 0.5148936170212766,\n \"acc_norm_stderr\": 0.032671518489247764\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.0416656757710158,\n \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.0416656757710158\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.02519710107424649,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.02519710107424649\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6903225806451613,\n \"acc_stderr\": 0.026302774983517418,\n \"acc_norm\": 0.6903225806451613,\n \"acc_norm_stderr\": 0.026302774983517418\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.03499113137676744,\n \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.03499113137676744\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.696969696969697,\n \"acc_stderr\": 0.03588624800091706,\n \"acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.03588624800091706\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.030532892233932026,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.030532892233932026\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015178,\n \"acc_norm\": 0.8601036269430051,\n 
\"acc_norm_stderr\": 0.025033870583015178\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5692307692307692,\n \"acc_stderr\": 0.025106820660539753,\n \"acc_norm\": 0.5692307692307692,\n \"acc_norm_stderr\": 0.025106820660539753\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524586,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524586\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6050420168067226,\n \"acc_stderr\": 0.03175367846096625,\n \"acc_norm\": 0.6050420168067226,\n \"acc_norm_stderr\": 0.03175367846096625\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.017149858514250948,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.017149858514250948\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591362,\n \"acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591362\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7341772151898734,\n \"acc_stderr\": 0.02875679962965834,\n \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.02875679962965834\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.043300437496507416,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.043300437496507416\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.033809398139433545,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.033809398139433545\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n \"acc_stderr\": 0.025140935950335428,\n \"acc_norm\": 0.8205128205128205,\n \"acc_norm_stderr\": 0.025140935950335428\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.7777777777777778,\n \"acc_stderr\": 0.014866821664709583,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.014866821664709583\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.024946792225272314,\n \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.024946792225272314\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n \"acc_stderr\": 0.01435591196476786,\n \"acc_norm\": 0.2435754189944134,\n \"acc_norm_stderr\": 0.01435591196476786\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666787,\n \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666787\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n \"acc_stderr\": 0.02623696588115326,\n \"acc_norm\": 0.6913183279742765,\n \"acc_norm_stderr\": 0.02623696588115326\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.02500646975579922,\n \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.02500646975579922\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4148936170212766,\n \"acc_stderr\": 0.0293922365846125,\n \"acc_norm\": 0.4148936170212766,\n \"acc_norm_stderr\": 0.0293922365846125\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43415906127770537,\n \"acc_stderr\": 0.012659033237067248,\n \"acc_norm\": 0.43415906127770537,\n \"acc_norm_stderr\": 0.012659033237067248\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.02934980313976587,\n \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.02934980313976587\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6143790849673203,\n \"acc_stderr\": 0.019691459052354015,\n \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.019691459052354015\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6285714285714286,\n \"acc_stderr\": 0.030932858792789855,\n \"acc_norm\": 0.6285714285714286,\n \"acc_norm_stderr\": 0.030932858792789855\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7960199004975125,\n \"acc_stderr\": 0.02849317624532607,\n \"acc_norm\": 0.7960199004975125,\n \"acc_norm_stderr\": 0.02849317624532607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368032,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368032\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2998776009791922,\n \"mc1_stderr\": 0.01604035296671362,\n \"mc2\": 0.43954153886534947,\n \"mc2_stderr\": 0.014894783303440727\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7458563535911602,\n \"acc_stderr\": 0.012236307219708267\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2623199393479909,\n \"acc_stderr\": 0.012116912419925704\n 
}\n}\n```", "repo_url": "https://huggingface.co/qblocks/mistral_7b_DolphinCoder", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|arc:challenge|25_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|gsm8k|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hellaswag|10_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T08-38-41.844099.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T08-38-41.844099.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T08-38-41.844099.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T08-38-41.844099.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T08-38-41.844099.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_05T08_38_41.844099", "path": ["**/details_harness|winogrande|5_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-05T08-38-41.844099.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_05T08_38_41.844099", "path": ["results_2024-01-05T08-38-41.844099.parquet"]}, {"split": "latest", "path": ["results_2024-01-05T08-38-41.844099.parquet"]}]}]} | 2024-01-05T08:41:21+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of qblocks/mistral_7b_DolphinCoder
Dataset automatically created during the evaluation run of model qblocks/mistral_7b_DolphinCoder on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
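A minimal sketch, assuming the repository follows the `open-llm-leaderboard/details_<org>__<model>` naming convention used throughout these cards:

```python
from datasets import load_dataset

# Load the per-sample details for one task configuration of this run.
data = load_dataset("open-llm-leaderboard/details_qblocks__mistral_7b_DolphinCoder",
	"harness_winogrande_5",
	split="train")
```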
## Latest results
These are the latest results from run 2024-01-05T08:38:41.844099 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of qblocks/mistral_7b_DolphinCoder\n\n\n\nDataset automatically created during the evaluation run of model qblocks/mistral_7b_DolphinCoder on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T08:38:41.844099(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of qblocks/mistral_7b_DolphinCoder\n\n\n\nDataset automatically created during the evaluation run of model qblocks/mistral_7b_DolphinCoder on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T08:38:41.844099(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
191,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of qblocks/mistral_7b_DolphinCoder\n\n\n\nDataset automatically created during the evaluation run of model qblocks/mistral_7b_DolphinCoder on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-05T08:38:41.844099(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
e3c60fe944d4d73a352a2328a20638d456498a16 |
# Dataset Card for Evaluation run of vihangd/smartsolmix-4x10.7b-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [vihangd/smartsolmix-4x10.7b-v1](https://huggingface.co/vihangd/smartsolmix-4x10.7b-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_vihangd__smartsolmix-4x10.7b-v1",
"harness_winogrande_5",
split="train")
```
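The aggregated metrics for the run can be loaded in the same way from the `results` configuration; a minimal sketch, using the `latest` split that this dataset's configuration list defines as an alias for the most recent run:

```python
from datasets import load_dataset

# Load the aggregated results of the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_vihangd__smartsolmix-4x10.7b-v1",
	"results",
	split="latest")
```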
## Latest results
These are the [latest results from run 2024-01-05T08:52:41.511718](https://huggingface.co/datasets/open-llm-leaderboard/details_vihangd__smartsolmix-4x10.7b-v1/blob/main/results_2024-01-05T08-52-41.511718.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6617320765919033,
"acc_stderr": 0.031609550329954696,
"acc_norm": 0.6640092521869942,
"acc_norm_stderr": 0.032248539123169905,
"mc1": 0.401468788249694,
"mc1_stderr": 0.017160273901693654,
"mc2": 0.5503302799032582,
"mc2_stderr": 0.015375535036682436
},
"harness|arc:challenge|25": {
"acc": 0.6006825938566553,
"acc_stderr": 0.014312094557946707,
"acc_norm": 0.6493174061433447,
"acc_norm_stderr": 0.013944635930726096
},
"harness|hellaswag|10": {
"acc": 0.660426209918343,
"acc_stderr": 0.004725967684806407,
"acc_norm": 0.8513244373630751,
"acc_norm_stderr": 0.003550412891647448
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.756578947368421,
"acc_stderr": 0.034923496688842384,
"acc_norm": 0.756578947368421,
"acc_norm_stderr": 0.034923496688842384
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544067,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6085106382978723,
"acc_stderr": 0.03190701242326812,
"acc_norm": 0.6085106382978723,
"acc_norm_stderr": 0.03190701242326812
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4470899470899471,
"acc_stderr": 0.025606723995777028,
"acc_norm": 0.4470899470899471,
"acc_norm_stderr": 0.025606723995777028
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8,
"acc_stderr": 0.02275520495954294,
"acc_norm": 0.8,
"acc_norm_stderr": 0.02275520495954294
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.03087414513656209,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.03087414513656209
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.024063156416822513,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.024063156416822513
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6846153846153846,
"acc_stderr": 0.023559646983189946,
"acc_norm": 0.6846153846153846,
"acc_norm_stderr": 0.023559646983189946
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.029318203645206865,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.029318203645206865
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.02995382389188703,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.02995382389188703
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.015555802713590177,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.015555802713590177
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931796,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931796
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8523206751054853,
"acc_stderr": 0.023094329582595698,
"acc_norm": 0.8523206751054853,
"acc_norm_stderr": 0.023094329582595698
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909456,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909456
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179333,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.013547415658662257,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.013547415658662257
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468365,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468365
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3396648044692737,
"acc_stderr": 0.015839400406212494,
"acc_norm": 0.3396648044692737,
"acc_norm_stderr": 0.015839400406212494
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7810457516339869,
"acc_stderr": 0.02367908986180772,
"acc_norm": 0.7810457516339869,
"acc_norm_stderr": 0.02367908986180772
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188936,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188936
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7623456790123457,
"acc_stderr": 0.023683591837008553,
"acc_norm": 0.7623456790123457,
"acc_norm_stderr": 0.023683591837008553
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.02982074719142244,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.02982074719142244
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.49674054758800523,
"acc_stderr": 0.012769964760343309,
"acc_norm": 0.49674054758800523,
"acc_norm_stderr": 0.012769964760343309
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.026799562024887664,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.026799562024887664
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6879084967320261,
"acc_stderr": 0.018745011201277657,
"acc_norm": 0.6879084967320261,
"acc_norm_stderr": 0.018745011201277657
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7836734693877551,
"acc_stderr": 0.026358916334904017,
"acc_norm": 0.7836734693877551,
"acc_norm_stderr": 0.026358916334904017
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482707,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587952,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587952
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.401468788249694,
"mc1_stderr": 0.017160273901693654,
"mc2": 0.5503302799032582,
"mc2_stderr": 0.015375535036682436
},
"harness|winogrande|5": {
"acc": 0.8342541436464088,
"acc_stderr": 0.01045089954537063
},
"harness|gsm8k|5": {
"acc": 0.5943896891584534,
"acc_stderr": 0.013524848894462115
}
}
```
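As a quick sanity check, the per-task entries above can be aggregated directly. A minimal sketch, assuming the JSON document shown above is available as a Python string named `results_json` (a hypothetical variable introduced only for this example); note it takes a simple unweighted mean over the hendrycksTest subtasks:

```python
import json

# Parse the results document shown above (assumed to be held in `results_json`).
results = json.loads(results_json)

# Unweighted average of acc_norm over the hendrycksTest (MMLU) subtasks.
mmlu_scores = [v["acc_norm"] for k, v in results.items()
               if k.startswith("harness|hendrycksTest-")]
mmlu_avg = sum(mmlu_scores) / len(mmlu_scores)

print(f"ARC (acc_norm):       {results['harness|arc:challenge|25']['acc_norm']:.4f}")
print(f"HellaSwag (acc_norm): {results['harness|hellaswag|10']['acc_norm']:.4f}")
print(f"MMLU average:         {mmlu_avg:.4f}")
print(f"GSM8K (acc):          {results['harness|gsm8k|5']['acc']:.4f}")
```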
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_vihangd__smartsolmix-4x10.7b-v1 | [
"region:us"
] | 2024-01-05T08:54:59+00:00 | {"pretty_name": "Evaluation run of vihangd/smartsolmix-4x10.7b-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [vihangd/smartsolmix-4x10.7b-v1](https://huggingface.co/vihangd/smartsolmix-4x10.7b-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vihangd__smartsolmix-4x10.7b-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-05T08:52:41.511718](https://huggingface.co/datasets/open-llm-leaderboard/details_vihangd__smartsolmix-4x10.7b-v1/blob/main/results_2024-01-05T08-52-41.511718.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6617320765919033,\n \"acc_stderr\": 0.031609550329954696,\n \"acc_norm\": 0.6640092521869942,\n \"acc_norm_stderr\": 0.032248539123169905,\n \"mc1\": 0.401468788249694,\n \"mc1_stderr\": 0.017160273901693654,\n \"mc2\": 0.5503302799032582,\n \"mc2_stderr\": 0.015375535036682436\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6006825938566553,\n \"acc_stderr\": 0.014312094557946707,\n \"acc_norm\": 0.6493174061433447,\n \"acc_norm_stderr\": 0.013944635930726096\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.660426209918343,\n \"acc_stderr\": 0.004725967684806407,\n \"acc_norm\": 0.8513244373630751,\n \"acc_norm_stderr\": 0.003550412891647448\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.756578947368421,\n \"acc_stderr\": 0.034923496688842384,\n \"acc_norm\": 0.756578947368421,\n \"acc_norm_stderr\": 0.034923496688842384\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544067,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544067\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6085106382978723,\n \"acc_stderr\": 0.03190701242326812,\n \"acc_norm\": 0.6085106382978723,\n \"acc_norm_stderr\": 0.03190701242326812\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4470899470899471,\n \"acc_stderr\": 0.025606723995777028,\n \"acc_norm\": 0.4470899470899471,\n \"acc_norm_stderr\": 0.025606723995777028\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.02275520495954294,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.02275520495954294\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656209,\n \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656209\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822513,\n \"acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822513\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": 
{\n \"acc\": 0.6846153846153846,\n \"acc_stderr\": 0.023559646983189946,\n \"acc_norm\": 0.6846153846153846,\n \"acc_norm_stderr\": 0.023559646983189946\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.362962962962963,\n \"acc_stderr\": 0.029318203645206865,\n \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.029318203645206865\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.02995382389188703,\n \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.02995382389188703\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.015555802713590177,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.015555802713590177\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6018518518518519,\n \"acc_stderr\": 0.033384734032074016,\n \"acc_norm\": 0.6018518518518519,\n \"acc_norm_stderr\": 0.033384734032074016\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931796,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931796\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8523206751054853,\n \"acc_stderr\": 0.023094329582595698,\n \"acc_norm\": 0.8523206751054853,\n \"acc_norm_stderr\": 0.023094329582595698\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909456,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909456\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.020930193185179333,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.020930193185179333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n \"acc_stderr\": 
0.013547415658662257,\n \"acc_norm\": 0.8263090676883781,\n \"acc_norm_stderr\": 0.013547415658662257\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468365,\n \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468365\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3396648044692737,\n \"acc_stderr\": 0.015839400406212494,\n \"acc_norm\": 0.3396648044692737,\n \"acc_norm_stderr\": 0.015839400406212494\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7810457516339869,\n \"acc_stderr\": 0.02367908986180772,\n \"acc_norm\": 0.7810457516339869,\n \"acc_norm_stderr\": 0.02367908986180772\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188936,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188936\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7623456790123457,\n \"acc_stderr\": 0.023683591837008553,\n \"acc_norm\": 0.7623456790123457,\n \"acc_norm_stderr\": 0.023683591837008553\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5106382978723404,\n \"acc_stderr\": 0.02982074719142244,\n \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.02982074719142244\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49674054758800523,\n \"acc_stderr\": 0.012769964760343309,\n \"acc_norm\": 0.49674054758800523,\n \"acc_norm_stderr\": 0.012769964760343309\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.026799562024887664,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.026799562024887664\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6879084967320261,\n \"acc_stderr\": 0.018745011201277657,\n \"acc_norm\": 0.6879084967320261,\n \"acc_norm_stderr\": 0.018745011201277657\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7836734693877551,\n \"acc_stderr\": 0.026358916334904017,\n \"acc_norm\": 0.7836734693877551,\n \"acc_norm_stderr\": 0.026358916334904017\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.02519692987482707,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.02519692987482707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587952,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587952\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.401468788249694,\n \"mc1_stderr\": 0.017160273901693654,\n \"mc2\": 0.5503302799032582,\n \"mc2_stderr\": 0.015375535036682436\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8342541436464088,\n \"acc_stderr\": 0.01045089954537063\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5943896891584534,\n \"acc_stderr\": 0.013524848894462115\n }\n}\n```", "repo_url": 
"https://huggingface.co/vihangd/smartsolmix-4x10.7b-v1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|arc:challenge|25_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|gsm8k|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hellaswag|10_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T08-52-41.511718.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T08-52-41.511718.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T08-52-41.511718.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T08-52-41.511718.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T08-52-41.511718.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_05T08_52_41.511718", "path": ["**/details_harness|winogrande|5_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-05T08-52-41.511718.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_05T08_52_41.511718", "path": ["results_2024-01-05T08-52-41.511718.parquet"]}, {"split": "latest", "path": ["results_2024-01-05T08-52-41.511718.parquet"]}]}]} | 2024-01-05T08:55:22+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of vihangd/smartsolmix-4x10.7b-v1
Dataset automatically created during the evaluation run of model vihangd/smartsolmix-4x10.7b-v1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-05T08:52:41.511718 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of vihangd/smartsolmix-4x10.7b-v1\n\n\n\nDataset automatically created during the evaluation run of model vihangd/smartsolmix-4x10.7b-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T08:52:41.511718(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of vihangd/smartsolmix-4x10.7b-v1\n\n\n\nDataset automatically created during the evaluation run of model vihangd/smartsolmix-4x10.7b-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T08:52:41.511718(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
191,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of vihangd/smartsolmix-4x10.7b-v1\n\n\n\nDataset automatically created during the evaluation run of model vihangd/smartsolmix-4x10.7b-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-05T08:52:41.511718(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
4a94e82607e19e60473d7d2da8c7f50127c1c29d |
# Dataset Card for Evaluation run of KnutJaegersberg/Deacon-20B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [KnutJaegersberg/Deacon-20B](https://huggingface.co/KnutJaegersberg/Deacon-20B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__Deacon-20B",
"harness_winogrande_5",
split="train")
```
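
As a follow-up, here is a minimal, untested sketch (assuming only that the `datasets` library is installed) for inspecting the most recent per-example details of a single task; the `"latest"` split name is taken from this dataset's configuration metadata and always points at the newest run:

```python
from datasets import load_dataset

# Load the most recent per-example details for one task; the "latest" split
# always points at the newest evaluation run for this model.
details = load_dataset(
    "open-llm-leaderboard/details_KnutJaegersberg__Deacon-20B",
    "harness_winogrande_5",
    split="latest",
)

print(details.column_names)  # which per-example fields were logged
print(details[0])            # inspect a single evaluated example
```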
## Latest results
These are the [latest results from run 2024-01-05T09:05:17.184238](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Deacon-20B/blob/main/results_2024-01-05T09-05-17.184238.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6041366068244398,
"acc_stderr": 0.032898915535709075,
"acc_norm": 0.6106134294937929,
"acc_norm_stderr": 0.033580635198863264,
"mc1": 0.40758873929008566,
"mc1_stderr": 0.017201949234553104,
"mc2": 0.5848788971105185,
"mc2_stderr": 0.01542200303332033
},
"harness|arc:challenge|25": {
"acc": 0.5656996587030717,
"acc_stderr": 0.014484703048857362,
"acc_norm": 0.6075085324232082,
"acc_norm_stderr": 0.014269634635670728
},
"harness|hellaswag|10": {
"acc": 0.6254730133439554,
"acc_stderr": 0.004830113797327048,
"acc_norm": 0.8173670583549094,
"acc_norm_stderr": 0.0038557568514415433
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6339622641509434,
"acc_stderr": 0.029647813539365252,
"acc_norm": 0.6339622641509434,
"acc_norm_stderr": 0.029647813539365252
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5549132947976878,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.5549132947976878,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5148936170212766,
"acc_stderr": 0.03267151848924777,
"acc_norm": 0.5148936170212766,
"acc_norm_stderr": 0.03267151848924777
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.025043757318520203,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.025043757318520203
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.043435254289490965,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.043435254289490965
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7387096774193549,
"acc_stderr": 0.024993053397764812,
"acc_norm": 0.7387096774193549,
"acc_norm_stderr": 0.024993053397764812
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.030313710538198913,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.030313710538198913
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.02247325333276877,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.02247325333276877
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5923076923076923,
"acc_stderr": 0.024915243985987847,
"acc_norm": 0.5923076923076923,
"acc_norm_stderr": 0.024915243985987847
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945273,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945273
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5966386554621849,
"acc_stderr": 0.031866081214088314,
"acc_norm": 0.5966386554621849,
"acc_norm_stderr": 0.031866081214088314
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658754,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658754
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8165137614678899,
"acc_stderr": 0.016595259710399306,
"acc_norm": 0.8165137614678899,
"acc_norm_stderr": 0.016595259710399306
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.02933116229425174,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.02933116229425174
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.02508596114457964,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.02508596114457964
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776678,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776678
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6870229007633588,
"acc_stderr": 0.04066962905677697,
"acc_norm": 0.6870229007633588,
"acc_norm_stderr": 0.04066962905677697
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489122,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489122
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.02336505149175372,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.02336505149175372
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7854406130268199,
"acc_stderr": 0.014680033956893346,
"acc_norm": 0.7854406130268199,
"acc_norm_stderr": 0.014680033956893346
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6907514450867052,
"acc_stderr": 0.02488314057007176,
"acc_norm": 0.6907514450867052,
"acc_norm_stderr": 0.02488314057007176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2245810055865922,
"acc_stderr": 0.013956803666544641,
"acc_norm": 0.2245810055865922,
"acc_norm_stderr": 0.013956803666544641
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.02617390850671858,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.02617390850671858
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6591639871382636,
"acc_stderr": 0.026920841260776162,
"acc_norm": 0.6591639871382636,
"acc_norm_stderr": 0.026920841260776162
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6728395061728395,
"acc_stderr": 0.026105673861409825,
"acc_norm": 0.6728395061728395,
"acc_norm_stderr": 0.026105673861409825
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.029700453247291477,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.029700453247291477
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47196870925684486,
"acc_stderr": 0.012750151802922442,
"acc_norm": 0.47196870925684486,
"acc_norm_stderr": 0.012750151802922442
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6139705882352942,
"acc_stderr": 0.029573269134411127,
"acc_norm": 0.6139705882352942,
"acc_norm_stderr": 0.029573269134411127
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6437908496732027,
"acc_stderr": 0.019373332420724504,
"acc_norm": 0.6437908496732027,
"acc_norm_stderr": 0.019373332420724504
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128438,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128438
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7860696517412935,
"acc_stderr": 0.028996909693328903,
"acc_norm": 0.7860696517412935,
"acc_norm_stderr": 0.028996909693328903
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.0337997668989631,
"acc_norm": 0.87,
"acc_norm_stderr": 0.0337997668989631
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686399,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686399
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40758873929008566,
"mc1_stderr": 0.017201949234553104,
"mc2": 0.5848788971105185,
"mc2_stderr": 0.01542200303332033
},
"harness|winogrande|5": {
"acc": 0.7679558011049724,
"acc_stderr": 0.011864149691827933
},
"harness|gsm8k|5": {
"acc": 0.2918877937831691,
"acc_stderr": 0.012522795894420869
}
}
```
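
If you would rather read these aggregated numbers programmatically than from the JSON above, a minimal, untested sketch using the "results" configuration mentioned earlier could look like this (no field names are assumed, the record is only inspected):

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run;
# the "latest" split points at the most recent one.
results = load_dataset(
    "open-llm-leaderboard/details_KnutJaegersberg__Deacon-20B",
    "results",
    split="latest",
)

row = results[0]
print(len(row))        # number of aggregated fields in the run record
print(list(row)[:10])  # peek at the first few field names
```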
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_KnutJaegersberg__Deacon-20B | [
"region:us"
] | 2024-01-05T09:07:20+00:00 | {"pretty_name": "Evaluation run of KnutJaegersberg/Deacon-20B", "dataset_summary": "Dataset automatically created during the evaluation run of model [KnutJaegersberg/Deacon-20B](https://huggingface.co/KnutJaegersberg/Deacon-20B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__Deacon-20B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-05T09:05:17.184238](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Deacon-20B/blob/main/results_2024-01-05T09-05-17.184238.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6041366068244398,\n \"acc_stderr\": 0.032898915535709075,\n \"acc_norm\": 0.6106134294937929,\n \"acc_norm_stderr\": 0.033580635198863264,\n \"mc1\": 0.40758873929008566,\n \"mc1_stderr\": 0.017201949234553104,\n \"mc2\": 0.5848788971105185,\n \"mc2_stderr\": 0.01542200303332033\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5656996587030717,\n \"acc_stderr\": 0.014484703048857362,\n \"acc_norm\": 0.6075085324232082,\n \"acc_norm_stderr\": 0.014269634635670728\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6254730133439554,\n \"acc_stderr\": 0.004830113797327048,\n \"acc_norm\": 0.8173670583549094,\n \"acc_norm_stderr\": 0.0038557568514415433\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421296,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421296\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6339622641509434,\n \"acc_stderr\": 0.029647813539365252,\n \"acc_norm\": 0.6339622641509434,\n \"acc_norm_stderr\": 0.029647813539365252\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5549132947976878,\n \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.5549132947976878,\n \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5148936170212766,\n \"acc_stderr\": 0.03267151848924777,\n \"acc_norm\": 0.5148936170212766,\n \"acc_norm_stderr\": 0.03267151848924777\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3835978835978836,\n \"acc_stderr\": 0.025043757318520203,\n \"acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.025043757318520203\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.043435254289490965,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.043435254289490965\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7387096774193549,\n \"acc_stderr\": 0.024993053397764812,\n \"acc_norm\": 0.7387096774193549,\n \"acc_norm_stderr\": 0.024993053397764812\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198913,\n \"acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198913\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.02247325333276877,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.02247325333276877\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5923076923076923,\n \"acc_stderr\": 0.024915243985987847,\n \"acc_norm\": 0.5923076923076923,\n \"acc_norm_stderr\": 0.024915243985987847\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945273,\n \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945273\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5966386554621849,\n \"acc_stderr\": 0.031866081214088314,\n \"acc_norm\": 0.5966386554621849,\n \"acc_norm_stderr\": 0.031866081214088314\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658754,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658754\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8165137614678899,\n \"acc_stderr\": 0.016595259710399306,\n \"acc_norm\": 0.8165137614678899,\n \"acc_norm_stderr\": 0.016595259710399306\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.02933116229425174,\n \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.02933116229425174\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8185654008438819,\n \"acc_stderr\": 0.02508596114457964,\n \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.02508596114457964\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n \"acc_stderr\": 0.03160295143776678,\n \"acc_norm\": 0.6681614349775785,\n \"acc_norm_stderr\": 0.03160295143776678\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6870229007633588,\n \"acc_stderr\": 0.04066962905677697,\n \"acc_norm\": 0.6870229007633588,\n \"acc_norm_stderr\": 0.04066962905677697\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489122,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489122\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n \"acc_stderr\": 0.02336505149175372,\n \"acc_norm\": 0.8504273504273504,\n \"acc_norm_stderr\": 0.02336505149175372\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.7854406130268199,\n \"acc_stderr\": 0.014680033956893346,\n \"acc_norm\": 0.7854406130268199,\n \"acc_norm_stderr\": 0.014680033956893346\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6907514450867052,\n \"acc_stderr\": 0.02488314057007176,\n \"acc_norm\": 0.6907514450867052,\n \"acc_norm_stderr\": 0.02488314057007176\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2245810055865922,\n \"acc_stderr\": 0.013956803666544641,\n \"acc_norm\": 0.2245810055865922,\n \"acc_norm_stderr\": 0.013956803666544641\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.02617390850671858,\n \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.02617390850671858\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6591639871382636,\n \"acc_stderr\": 0.026920841260776162,\n \"acc_norm\": 0.6591639871382636,\n \"acc_norm_stderr\": 0.026920841260776162\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6728395061728395,\n \"acc_stderr\": 0.026105673861409825,\n \"acc_norm\": 0.6728395061728395,\n \"acc_norm_stderr\": 0.026105673861409825\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291477,\n \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291477\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47196870925684486,\n \"acc_stderr\": 0.012750151802922442,\n \"acc_norm\": 0.47196870925684486,\n \"acc_norm_stderr\": 0.012750151802922442\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6139705882352942,\n \"acc_stderr\": 0.029573269134411127,\n \"acc_norm\": 0.6139705882352942,\n \"acc_norm_stderr\": 0.029573269134411127\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6437908496732027,\n \"acc_stderr\": 0.019373332420724504,\n \"acc_norm\": 0.6437908496732027,\n \"acc_norm_stderr\": 0.019373332420724504\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128438,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128438\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7860696517412935,\n \"acc_stderr\": 0.028996909693328903,\n \"acc_norm\": 0.7860696517412935,\n \"acc_norm_stderr\": 0.028996909693328903\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.0337997668989631,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.0337997668989631\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686399,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686399\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40758873929008566,\n \"mc1_stderr\": 0.017201949234553104,\n \"mc2\": 0.5848788971105185,\n \"mc2_stderr\": 0.01542200303332033\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7679558011049724,\n \"acc_stderr\": 0.011864149691827933\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2918877937831691,\n \"acc_stderr\": 
0.012522795894420869\n }\n}\n```", "repo_url": "https://huggingface.co/KnutJaegersberg/Deacon-20B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|arc:challenge|25_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|gsm8k|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hellaswag|10_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T09-05-17.184238.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T09-05-17.184238.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T09-05-17.184238.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T09-05-17.184238.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T09-05-17.184238.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_05T09_05_17.184238", "path": ["**/details_harness|winogrande|5_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-05T09-05-17.184238.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_05T09_05_17.184238", "path": ["results_2024-01-05T09-05-17.184238.parquet"]}, {"split": "latest", "path": ["results_2024-01-05T09-05-17.184238.parquet"]}]}]} | 2024-01-05T09:07:42+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of KnutJaegersberg/Deacon-20B
Dataset automatically created during the evaluation run of model KnutJaegersberg/Deacon-20B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
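(A minimal sketch; the repository id below is assumed from the leaderboard's usual `details_<org>__<model>` naming, and `harness_winogrande_5` is one of the configurations listed for this run.)

```python
from datasets import load_dataset

# Repository id assumed from the leaderboard's "details_<org>__<model>" naming convention
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__Deacon-20B",
                    "harness_winogrande_5",
                    split="train")
```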
## Latest results
These are the latest results from run 2024-01-05T09:05:17.184238 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of KnutJaegersberg/Deacon-20B\n\n\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/Deacon-20B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T09:05:17.184238(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of KnutJaegersberg/Deacon-20B\n\n\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/Deacon-20B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T09:05:17.184238(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
183,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of KnutJaegersberg/Deacon-20B\n\n\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/Deacon-20B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-05T09:05:17.184238(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
2bf1f5cc72beee057bb4d02d09a7d49ee3d708d6 | # Dataset Card for "full_sft_chat_data_filtered_final"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | haisonle001/full_sft_chat_data_filtered_final | [
"region:us"
] | 2024-01-05T09:08:51+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train_sft", "path": "data/train_sft-*"}]}], "dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "prompt_id", "dtype": "string"}, {"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train_sft", "num_bytes": 9893403961.87305, "num_examples": 5301513}], "download_size": 5178718630, "dataset_size": 9893403961.87305}} | 2024-01-05T09:17:32+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "full_sft_chat_data_filtered_final"
More Information needed | [
"# Dataset Card for \"full_sft_chat_data_filtered_final\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"full_sft_chat_data_filtered_final\"\n\nMore Information needed"
] | [
6,
23
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"full_sft_chat_data_filtered_final\"\n\nMore Information needed"
] |
cd51b601dcfdb3a830662b099e71e95e1af8f9ce | # Dataset Card for "pubmed-finetuning"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | anumafzal94/pubmed-finetuning | [
"region:us"
] | 2024-01-05T09:26:09+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}, {"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "summary", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 126100060, "num_examples": 6592}, {"name": "train", "num_bytes": 96822834.85018493, "num_examples": 5000}, {"name": "validation", "num_bytes": 126319645, "num_examples": 6559}], "download_size": 25825883, "dataset_size": 349242539.8501849}} | 2024-01-05T13:35:29+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "pubmed-finetuning"
More Information needed | [
"# Dataset Card for \"pubmed-finetuning\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"pubmed-finetuning\"\n\nMore Information needed"
] | [
6,
16
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"pubmed-finetuning\"\n\nMore Information needed"
] |
8218c96379db327f0a094b342bf50202691413c1 | # Dataset Card for "arxiv-finetuning"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | anumafzal94/arxiv-finetuning | [
"region:us"
] | 2024-01-05T09:26:29+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}, {"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "summary", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 216556336, "num_examples": 6438}, {"name": "train", "num_bytes": 176330440.40062648, "num_examples": 5000}, {"name": "validation", "num_bytes": 216170606, "num_examples": 6434}], "download_size": 236607847, "dataset_size": 609057382.4006264}} | 2024-01-05T15:46:05+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "arxiv-finetuning"
More Information needed | [
"# Dataset Card for \"arxiv-finetuning\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"arxiv-finetuning\"\n\nMore Information needed"
] | [
6,
17
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"arxiv-finetuning\"\n\nMore Information needed"
] |
5ed83ac66c9b33c916204dd1c259e1d3ae735a2a |
# Dataset Card for Evaluation run of moreh/MoMo-72B-LoRA-V1.4
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [moreh/MoMo-72B-LoRA-V1.4](https://huggingface.co/moreh/MoMo-72B-LoRA-V1.4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_moreh__MoMo-72B-LoRA-V1.4",
"harness_winogrande_5",
split="train")
```
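Each evaluated task has its own configuration, and the aggregated metrics live in the "results" configuration. A sketch of both (the per-task config name below is an assumption, following the harness naming pattern used by other leaderboard detail repositories; the "latest" split points at the most recent run):

```python
from datasets import load_dataset

# Per-task details; config name assumed to follow the usual harness naming pattern
mmlu_algebra = load_dataset("open-llm-leaderboard/details_moreh__MoMo-72B-LoRA-V1.4",
                            "harness_hendrycksTest_abstract_algebra_5",
                            split="latest")

# Aggregated metrics for the run are stored in the "results" configuration
results = load_dataset("open-llm-leaderboard/details_moreh__MoMo-72B-LoRA-V1.4",
                       "results",
                       split="latest")
```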
## Latest results
These are the [latest results from run 2024-01-05T09:27:55.373220](https://huggingface.co/datasets/open-llm-leaderboard/details_moreh__MoMo-72B-LoRA-V1.4/blob/main/results_2024-01-05T09-27-55.373220.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.767579679155859,
"acc_stderr": 0.028032995667067143,
"acc_norm": 0.7712502289161597,
"acc_norm_stderr": 0.028568503426373924,
"mc1": 0.4602203182374541,
"mc1_stderr": 0.01744801722396088,
"mc2": 0.626557569821075,
"mc2_stderr": 0.01486734061588185
},
"harness|arc:challenge|25": {
"acc": 0.6621160409556314,
"acc_stderr": 0.013822047922283507,
"acc_norm": 0.6919795221843004,
"acc_norm_stderr": 0.013491429517292037
},
"harness|hellaswag|10": {
"acc": 0.6597291376219877,
"acc_stderr": 0.004728318577835205,
"acc_norm": 0.8507269468233419,
"acc_norm_stderr": 0.0035562912320503525
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7185185185185186,
"acc_stderr": 0.038850042458002526,
"acc_norm": 0.7185185185185186,
"acc_norm_stderr": 0.038850042458002526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.881578947368421,
"acc_stderr": 0.026293995855474928,
"acc_norm": 0.881578947368421,
"acc_norm_stderr": 0.026293995855474928
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036844,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036844
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8415094339622642,
"acc_stderr": 0.022476528710167726,
"acc_norm": 0.8415094339622642,
"acc_norm_stderr": 0.022476528710167726
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9236111111111112,
"acc_stderr": 0.02221220393834591,
"acc_norm": 0.9236111111111112,
"acc_norm_stderr": 0.02221220393834591
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7745664739884393,
"acc_stderr": 0.031862098516411454,
"acc_norm": 0.7745664739884393,
"acc_norm_stderr": 0.031862098516411454
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.774468085106383,
"acc_stderr": 0.027321078417387536,
"acc_norm": 0.774468085106383,
"acc_norm_stderr": 0.027321078417387536
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6228070175438597,
"acc_stderr": 0.04559522141958216,
"acc_norm": 0.6228070175438597,
"acc_norm_stderr": 0.04559522141958216
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.8137931034482758,
"acc_stderr": 0.032439461590046174,
"acc_norm": 0.8137931034482758,
"acc_norm_stderr": 0.032439461590046174
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6534391534391535,
"acc_stderr": 0.024508777521028428,
"acc_norm": 0.6534391534391535,
"acc_norm_stderr": 0.024508777521028428
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8838709677419355,
"acc_stderr": 0.018225757949432306,
"acc_norm": 0.8838709677419355,
"acc_norm_stderr": 0.018225757949432306
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.645320197044335,
"acc_stderr": 0.0336612448905145,
"acc_norm": 0.645320197044335,
"acc_norm_stderr": 0.0336612448905145
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8727272727272727,
"acc_stderr": 0.026024657651656187,
"acc_norm": 0.8727272727272727,
"acc_norm_stderr": 0.026024657651656187
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9292929292929293,
"acc_stderr": 0.01826310542019951,
"acc_norm": 0.9292929292929293,
"acc_norm_stderr": 0.01826310542019951
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9792746113989638,
"acc_stderr": 0.010281417011909046,
"acc_norm": 0.9792746113989638,
"acc_norm_stderr": 0.010281417011909046
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8179487179487179,
"acc_stderr": 0.0195652367829309,
"acc_norm": 0.8179487179487179,
"acc_norm_stderr": 0.0195652367829309
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4703703703703704,
"acc_stderr": 0.030431963547936584,
"acc_norm": 0.4703703703703704,
"acc_norm_stderr": 0.030431963547936584
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8487394957983193,
"acc_stderr": 0.02327425589870794,
"acc_norm": 0.8487394957983193,
"acc_norm_stderr": 0.02327425589870794
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5496688741721855,
"acc_stderr": 0.04062290018683775,
"acc_norm": 0.5496688741721855,
"acc_norm_stderr": 0.04062290018683775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9174311926605505,
"acc_stderr": 0.011800361363016569,
"acc_norm": 0.9174311926605505,
"acc_norm_stderr": 0.011800361363016569
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.030998666304560534,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.030998666304560534
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.01831885585008968,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.01831885585008968
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9113924050632911,
"acc_stderr": 0.018498315206865384,
"acc_norm": 0.9113924050632911,
"acc_norm_stderr": 0.018498315206865384
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8116591928251121,
"acc_stderr": 0.026241132996407252,
"acc_norm": 0.8116591928251121,
"acc_norm_stderr": 0.026241132996407252
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8854961832061069,
"acc_stderr": 0.027927473753597453,
"acc_norm": 0.8854961832061069,
"acc_norm_stderr": 0.027927473753597453
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8842975206611571,
"acc_stderr": 0.029199802455622804,
"acc_norm": 0.8842975206611571,
"acc_norm_stderr": 0.029199802455622804
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.03520703990517962,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.03520703990517962
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8343558282208589,
"acc_stderr": 0.029208296231259104,
"acc_norm": 0.8343558282208589,
"acc_norm_stderr": 0.029208296231259104
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6160714285714286,
"acc_stderr": 0.04616143075028546,
"acc_norm": 0.6160714285714286,
"acc_norm_stderr": 0.04616143075028546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8932038834951457,
"acc_stderr": 0.030581088928331362,
"acc_norm": 0.8932038834951457,
"acc_norm_stderr": 0.030581088928331362
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9401709401709402,
"acc_stderr": 0.015537514263253874,
"acc_norm": 0.9401709401709402,
"acc_norm_stderr": 0.015537514263253874
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263734,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263734
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9195402298850575,
"acc_stderr": 0.009726831316141866,
"acc_norm": 0.9195402298850575,
"acc_norm_stderr": 0.009726831316141866
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8410404624277457,
"acc_stderr": 0.019685307033571946,
"acc_norm": 0.8410404624277457,
"acc_norm_stderr": 0.019685307033571946
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6871508379888268,
"acc_stderr": 0.015506892594647258,
"acc_norm": 0.6871508379888268,
"acc_norm_stderr": 0.015506892594647258
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8464052287581699,
"acc_stderr": 0.02064559791041878,
"acc_norm": 0.8464052287581699,
"acc_norm_stderr": 0.02064559791041878
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8327974276527331,
"acc_stderr": 0.021193872528034972,
"acc_norm": 0.8327974276527331,
"acc_norm_stderr": 0.021193872528034972
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8765432098765432,
"acc_stderr": 0.018303868806891787,
"acc_norm": 0.8765432098765432,
"acc_norm_stderr": 0.018303868806891787
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6382978723404256,
"acc_stderr": 0.02866382014719949,
"acc_norm": 0.6382978723404256,
"acc_norm_stderr": 0.02866382014719949
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6095176010430248,
"acc_stderr": 0.012460135913945066,
"acc_norm": 0.6095176010430248,
"acc_norm_stderr": 0.012460135913945066
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.022368672562886747,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.022368672562886747
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8120915032679739,
"acc_stderr": 0.015803565736776676,
"acc_norm": 0.8120915032679739,
"acc_norm_stderr": 0.015803565736776676
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.04172343038705383,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.04172343038705383
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8285714285714286,
"acc_stderr": 0.024127463462650153,
"acc_norm": 0.8285714285714286,
"acc_norm_stderr": 0.024127463462650153
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.900497512437811,
"acc_stderr": 0.0211662163046594,
"acc_norm": 0.900497512437811,
"acc_norm_stderr": 0.0211662163046594
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.93,
"acc_stderr": 0.025643239997624294,
"acc_norm": 0.93,
"acc_norm_stderr": 0.025643239997624294
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5903614457831325,
"acc_stderr": 0.038284011150790206,
"acc_norm": 0.5903614457831325,
"acc_norm_stderr": 0.038284011150790206
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015578,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015578
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4602203182374541,
"mc1_stderr": 0.01744801722396088,
"mc2": 0.626557569821075,
"mc2_stderr": 0.01486734061588185
},
"harness|winogrande|5": {
"acc": 0.8374112075769534,
"acc_stderr": 0.010370455551343345
},
"harness|gsm8k|5": {
"acc": 0.7020470053070508,
"acc_stderr": 0.012597932232914529
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_moreh__MoMo-70B-LoRA-V1.4 | [
"region:us"
] | 2024-01-05T09:30:08+00:00 | {"pretty_name": "Evaluation run of moreh/MoMo-72B-LoRA-V1.4", "dataset_summary": "Dataset automatically created during the evaluation run of model [moreh/MoMo-72B-LoRA-V1.4](https://huggingface.co/moreh/MoMo-72B-LoRA-V1.4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_moreh__MoMo-72B-LoRA-V1.4\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-05T09:27:55.373220](https://huggingface.co/datasets/open-llm-leaderboard/details_moreh__MoMo-72B-LoRA-V1.4/blob/main/results_2024-01-05T09-27-55.373220.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.767579679155859,\n \"acc_stderr\": 0.028032995667067143,\n \"acc_norm\": 0.7712502289161597,\n \"acc_norm_stderr\": 0.028568503426373924,\n \"mc1\": 0.4602203182374541,\n \"mc1_stderr\": 0.01744801722396088,\n \"mc2\": 0.626557569821075,\n \"mc2_stderr\": 0.01486734061588185\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6621160409556314,\n \"acc_stderr\": 0.013822047922283507,\n \"acc_norm\": 0.6919795221843004,\n \"acc_norm_stderr\": 0.013491429517292037\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6597291376219877,\n \"acc_stderr\": 0.004728318577835205,\n \"acc_norm\": 0.8507269468233419,\n \"acc_norm_stderr\": 0.0035562912320503525\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7185185185185186,\n \"acc_stderr\": 0.038850042458002526,\n \"acc_norm\": 0.7185185185185186,\n \"acc_norm_stderr\": 0.038850042458002526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.881578947368421,\n \"acc_stderr\": 0.026293995855474928,\n \"acc_norm\": 0.881578947368421,\n \"acc_norm_stderr\": 0.026293995855474928\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036844,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036844\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8415094339622642,\n \"acc_stderr\": 0.022476528710167726,\n \"acc_norm\": 0.8415094339622642,\n \"acc_norm_stderr\": 0.022476528710167726\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9236111111111112,\n \"acc_stderr\": 0.02221220393834591,\n \"acc_norm\": 0.9236111111111112,\n \"acc_norm_stderr\": 0.02221220393834591\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.53,\n 
\"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7745664739884393,\n \"acc_stderr\": 0.031862098516411454,\n \"acc_norm\": 0.7745664739884393,\n \"acc_norm_stderr\": 0.031862098516411454\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.774468085106383,\n \"acc_stderr\": 0.027321078417387536,\n \"acc_norm\": 0.774468085106383,\n \"acc_norm_stderr\": 0.027321078417387536\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6228070175438597,\n \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.6228070175438597,\n \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.8137931034482758,\n \"acc_stderr\": 0.032439461590046174,\n \"acc_norm\": 0.8137931034482758,\n \"acc_norm_stderr\": 0.032439461590046174\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6534391534391535,\n \"acc_stderr\": 0.024508777521028428,\n \"acc_norm\": 0.6534391534391535,\n \"acc_norm_stderr\": 0.024508777521028428\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8838709677419355,\n \"acc_stderr\": 0.018225757949432306,\n \"acc_norm\": 0.8838709677419355,\n \"acc_norm_stderr\": 0.018225757949432306\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.645320197044335,\n \"acc_stderr\": 0.0336612448905145,\n \"acc_norm\": 0.645320197044335,\n \"acc_norm_stderr\": 0.0336612448905145\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8727272727272727,\n \"acc_stderr\": 0.026024657651656187,\n \"acc_norm\": 0.8727272727272727,\n \"acc_norm_stderr\": 0.026024657651656187\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9292929292929293,\n \"acc_stderr\": 0.01826310542019951,\n \"acc_norm\": 0.9292929292929293,\n \"acc_norm_stderr\": 0.01826310542019951\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9792746113989638,\n \"acc_stderr\": 0.010281417011909046,\n \"acc_norm\": 0.9792746113989638,\n \"acc_norm_stderr\": 0.010281417011909046\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8179487179487179,\n \"acc_stderr\": 0.0195652367829309,\n \"acc_norm\": 0.8179487179487179,\n \"acc_norm_stderr\": 0.0195652367829309\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4703703703703704,\n \"acc_stderr\": 0.030431963547936584,\n \"acc_norm\": 0.4703703703703704,\n \"acc_norm_stderr\": 0.030431963547936584\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8487394957983193,\n \"acc_stderr\": 0.02327425589870794,\n \"acc_norm\": 0.8487394957983193,\n \"acc_norm_stderr\": 0.02327425589870794\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5496688741721855,\n \"acc_stderr\": 0.04062290018683775,\n \"acc_norm\": 0.5496688741721855,\n \"acc_norm_stderr\": 0.04062290018683775\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9174311926605505,\n \"acc_stderr\": 0.011800361363016569,\n \"acc_norm\": 0.9174311926605505,\n \"acc_norm_stderr\": 0.011800361363016569\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.7083333333333334,\n \"acc_stderr\": 0.030998666304560534,\n \"acc_norm\": 0.7083333333333334,\n \"acc_norm_stderr\": 0.030998666304560534\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9264705882352942,\n \"acc_stderr\": 0.01831885585008968,\n \"acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.01831885585008968\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9113924050632911,\n \"acc_stderr\": 0.018498315206865384,\n \"acc_norm\": 0.9113924050632911,\n \"acc_norm_stderr\": 0.018498315206865384\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8116591928251121,\n \"acc_stderr\": 0.026241132996407252,\n \"acc_norm\": 0.8116591928251121,\n \"acc_norm_stderr\": 0.026241132996407252\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8854961832061069,\n \"acc_stderr\": 0.027927473753597453,\n \"acc_norm\": 0.8854961832061069,\n \"acc_norm_stderr\": 0.027927473753597453\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8842975206611571,\n \"acc_stderr\": 0.029199802455622804,\n \"acc_norm\": 0.8842975206611571,\n \"acc_norm_stderr\": 0.029199802455622804\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8425925925925926,\n \"acc_stderr\": 0.03520703990517962,\n \"acc_norm\": 0.8425925925925926,\n \"acc_norm_stderr\": 0.03520703990517962\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8343558282208589,\n \"acc_stderr\": 0.029208296231259104,\n \"acc_norm\": 0.8343558282208589,\n \"acc_norm_stderr\": 0.029208296231259104\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6160714285714286,\n \"acc_stderr\": 0.04616143075028546,\n \"acc_norm\": 0.6160714285714286,\n \"acc_norm_stderr\": 0.04616143075028546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8932038834951457,\n \"acc_stderr\": 0.030581088928331362,\n \"acc_norm\": 0.8932038834951457,\n \"acc_norm_stderr\": 0.030581088928331362\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9401709401709402,\n \"acc_stderr\": 0.015537514263253874,\n \"acc_norm\": 0.9401709401709402,\n \"acc_norm_stderr\": 0.015537514263253874\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263734,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263734\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.9195402298850575,\n \"acc_stderr\": 0.009726831316141866,\n \"acc_norm\": 0.9195402298850575,\n \"acc_norm_stderr\": 0.009726831316141866\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8410404624277457,\n \"acc_stderr\": 0.019685307033571946,\n \"acc_norm\": 0.8410404624277457,\n \"acc_norm_stderr\": 0.019685307033571946\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6871508379888268,\n \"acc_stderr\": 0.015506892594647258,\n \"acc_norm\": 0.6871508379888268,\n \"acc_norm_stderr\": 0.015506892594647258\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8464052287581699,\n \"acc_stderr\": 0.02064559791041878,\n \"acc_norm\": 0.8464052287581699,\n \"acc_norm_stderr\": 0.02064559791041878\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8327974276527331,\n \"acc_stderr\": 0.021193872528034972,\n \"acc_norm\": 0.8327974276527331,\n \"acc_norm_stderr\": 0.021193872528034972\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8765432098765432,\n \"acc_stderr\": 0.018303868806891787,\n \"acc_norm\": 0.8765432098765432,\n \"acc_norm_stderr\": 0.018303868806891787\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6382978723404256,\n \"acc_stderr\": 0.02866382014719949,\n \"acc_norm\": 0.6382978723404256,\n \"acc_norm_stderr\": 0.02866382014719949\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6095176010430248,\n \"acc_stderr\": 0.012460135913945066,\n \"acc_norm\": 0.6095176010430248,\n \"acc_norm_stderr\": 0.012460135913945066\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8382352941176471,\n \"acc_stderr\": 0.022368672562886747,\n \"acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.022368672562886747\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8120915032679739,\n \"acc_stderr\": 0.015803565736776676,\n \"acc_norm\": 0.8120915032679739,\n \"acc_norm_stderr\": 0.015803565736776676\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.04172343038705383,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.04172343038705383\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8285714285714286,\n \"acc_stderr\": 0.024127463462650153,\n \"acc_norm\": 0.8285714285714286,\n \"acc_norm_stderr\": 0.024127463462650153\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n \"acc_stderr\": 0.0211662163046594,\n \"acc_norm\": 0.900497512437811,\n \"acc_norm_stderr\": 0.0211662163046594\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.93,\n \"acc_stderr\": 0.025643239997624294,\n \"acc_norm\": 0.93,\n \"acc_norm_stderr\": 0.025643239997624294\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5903614457831325,\n \"acc_stderr\": 0.038284011150790206,\n \"acc_norm\": 0.5903614457831325,\n \"acc_norm_stderr\": 0.038284011150790206\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015578,\n \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015578\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4602203182374541,\n \"mc1_stderr\": 0.01744801722396088,\n \"mc2\": 0.626557569821075,\n \"mc2_stderr\": 0.01486734061588185\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8374112075769534,\n \"acc_stderr\": 0.010370455551343345\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7020470053070508,\n \"acc_stderr\": 0.012597932232914529\n 
}\n}\n```", "repo_url": "https://huggingface.co/moreh/MoMo-72B-LoRA-V1.4", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|arc:challenge|25_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|gsm8k|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hellaswag|10_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T09-27-55.373220.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T09-27-55.373220.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T09-27-55.373220.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T09-27-55.373220.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T09-27-55.373220.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_05T09_27_55.373220", "path": ["**/details_harness|winogrande|5_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-05T09-27-55.373220.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_05T09_27_55.373220", "path": ["results_2024-01-05T09-27-55.373220.parquet"]}, {"split": "latest", "path": ["results_2024-01-05T09-27-55.373220.parquet"]}]}]} | 2024-01-24T09:51:21+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of moreh/MoMo-72B-LoRA-V1.4
Dataset automatically created during the evaluation run of model moreh/MoMo-72B-LoRA-V1.4 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
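The snippet below follows the standard Open LLM Leaderboard loading pattern recorded in the card metadata; `harness_winogrande_5` is just one of the available configuration names and any of the 63 task configurations can be substituted.

```python
from datasets import load_dataset

# Pick any of the evaluated task configurations; "harness_winogrande_5" is one example.
data = load_dataset("open-llm-leaderboard/details_moreh__MoMo-72B-LoRA-V1.4",
                    "harness_winogrande_5",
                    split="train")
```

The aggregated scores can be loaded the same way by passing the `results` configuration name instead.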
## Latest results
These are the latest results from run 2024-01-05T09:27:55.373220 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of moreh/MoMo-72B-LoRA-V1.4\n\n\n\nDataset automatically created during the evaluation run of model moreh/MoMo-72B-LoRA-V1.4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T09:27:55.373220(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of moreh/MoMo-72B-LoRA-V1.4\n\n\n\nDataset automatically created during the evaluation run of model moreh/MoMo-72B-LoRA-V1.4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T09:27:55.373220(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
189,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of moreh/MoMo-72B-LoRA-V1.4\n\n\n\nDataset automatically created during the evaluation run of model moreh/MoMo-72B-LoRA-V1.4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-05T09:27:55.373220(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
b7d52267f8339464398bdc332d7e475caf5cb3f4 | # Dataset Card for Dataset for presuicidal signal detection
<!-- Provide a quick summary of the dataset. -->
This dataset is dedicated to finding texts that contain information that helps to assess a person's suicide risk.
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** Igor Buyanov ([email protected])
- **Language(s) (NLP):** Russian
- **License:** MIT
### Dataset Sources
<!-- Provide the basic links for the dataset. -->
- **Repository:** [link](https://data.mendeley.com/datasets/86v3z38dc7/1)
- **Paper:** [link](https://astromis.github.io/assets/pdf/buyanoviplussochenkovi046.pdf)
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
The dataset is intended to be used to train a model that can help psychologists analyze the accounts of potentially suicidal people faster, in order to find clues and facts that help them in treatment.
## Dataset Structure
The dataset has two categories: normal text (0) and text with potentially useful information about a person's suicide signals (1). These signals are:
* Texts describing negative events that occurred with the subject in the past or in the present - messages that are factual, describing negative moments that can happen to a person, such as attempts and facts of rape, problems with parents, the fact of being in a psychiatric hospital, facts of self-harm, etc.
* Current negative emotional state - messages containing a display of subjective negative attitude towards oneself and others, including a desire to die, a feeling of pressure from the past, self-hatred, aggressiveness, rage directed at oneself or others.
Note that the source dataset referenced in **Repository** contains five categories. Due to the underrepresentation of some categories and extreme class imbalance, the dataset was transformed to have only two categories. See the paper for more details.
The dataset is split into train and test parts. The current count distribution is as follows:
```
DatasetDict({
train: Dataset({
features: ['text', 'label'],
num_rows: 22787
})
test: Dataset({
features: ['text', 'label'],
num_rows: 9767
})
})
```
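A minimal sketch of loading the splits with the Hugging Face `datasets` library, assuming the Hub id `astromis/presuicidal_signals` used for this card:

```python
from datasets import load_dataset

# Each example carries a raw "text" string and an integer "label":
# 0 = normal text, 1 = text with potentially useful presuicidal signals.
dataset = load_dataset("astromis/presuicidal_signals")

train, test = dataset["train"], dataset["test"]
print(train[0]["label"], train[0]["text"])
```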
## Dataset Creation
### Source Data
Accounts of Russian Twitter users that were marked as having a tendency toward suicide.
### Annotations
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
See the paper.
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
The dataset may contain some personal information that was shared by Twitter users themselves.
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
```bibtex
@article{Buyanov2022TheDF,
title={The dataset for presuicidal signals detection in text and its analysis},
author={Igor Buyanov and Ilya Sochenkov},
journal={Computational Linguistics and Intellectual Technologies},
year={2022},
month={June},
number={21},
pages={81--92},
url={https://api.semanticscholar.org/CorpusID:253195162},
}
```
## Dataset Card Authors
Igor Buyanov
## Dataset Card Contact
[email protected] | astromis/presuicidal_signals | [
"task_categories:text-classification",
"size_categories:10K<n<100K",
"language:ru",
"license:mit",
"psyhology",
"text classification",
"suicide",
"region:us"
] | 2024-01-05T09:35:36+00:00 | {"language": ["ru"], "license": "mit", "size_categories": ["10K<n<100K"], "task_categories": ["text-classification"], "pretty_name": "Dataset for presuicidal signal detection", "tags": ["psyhology", "text classification", "suicide"], "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "label", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 4006893, "num_examples": 22787}, {"name": "test", "num_bytes": 1721497, "num_examples": 9767}], "download_size": 3145819, "dataset_size": 5728390}} | 2024-01-05T12:43:23+00:00 | [] | [
"ru"
] | TAGS
#task_categories-text-classification #size_categories-10K<n<100K #language-Russian #license-mit #psyhology #text classification #suicide #region-us
| # Dataset Card for Dataset for presuicidal signal detection
This dataset dedicated to find texts that contain information that helps to diagnosis person's suicide rating.
## Dataset Details
### Dataset Description
- Curated by: Igor Buyanov (URL.o@URL)
- Language(s) (NLP): Russian
- License: MIT
### Dataset Sources
- Repository: link
- Paper: link
## Uses
The dataset is intended to use to train the model that can help the psychologists to analyze the potential suicidal person accounts faster in order to find clues and facts that helps them in threatment.
## Dataset Structure
The dataset has two categories: the normal text (0) and text with potential useful information about person's suicide signals (1). These signals are:
* Texts describing negative events that occurred with the subject in the past or in the present - messages that are factual, describing negative moments that can happen to a person, such as attempts and facts of rape, problems with parents, the fact of being in a psychiatric hospital, facts of self-harm, etc.
* Current negative emotional state - messages containing a display of subjective negative attitude towards oneself and others, including a desire to die, a feeling of pressure from the past, self-hatred, aggressiveness, rage directed at oneself or others.
Note that source dataset that was pointed in Repository contains five categories. Due to unrepresentation of some categories and extremeimbalance, the dataset were transformed to have only two categories. See the paper for more details.
The dataset is splitted to train and test parts. Current count distribution is as follows:
## Dataset Creation
### Source Data
Accounts of Russian persons on Twitter that were marked as having tendency to suicide.
### Annotations
See the paper.
#### Personal and Sensitive Information
The dataset may contain some personal information that was shared by Twitter users themselves.
BibTeX:
## Dataset Card Authors
Igor Buyanov
## Dataset Card Contact
URL.o@URL | [
"# Dataset Card for Dataset for presuicidal signal detection\n\n\n\nThis dataset dedicated to find texts that contain information that helps to diagnosis person's suicide rating.",
"## Dataset Details",
"### Dataset Description\n\n\n\n- Curated by: Igor Buyanov (URL.o@URL)\n- Language(s) (NLP): Russian\n- License: MIT",
"### Dataset Sources\n\n\n\n- Repository: link\n- Paper: link",
"## Uses\n\n\n\nThe dataset is intended to use to train the model that can help the psychologists to analyze the potential suicidal person accounts faster in order to find clues and facts that helps them in threatment.",
"## Dataset Structure\n\nThe dataset has two categories: the normal text (0) and text with potential useful information about person's suicide signals (1). These signals are:\n* Texts describing negative events that occurred with the subject in the past or in the present - messages that are factual, describing negative moments that can happen to a person, such as attempts and facts of rape, problems with parents, the fact of being in a psychiatric hospital, facts of self-harm, etc.\n* Current negative emotional state - messages containing a display of subjective negative attitude towards oneself and others, including a desire to die, a feeling of pressure from the past, self-hatred, aggressiveness, rage directed at oneself or others.\n\nNote that source dataset that was pointed in Repository contains five categories. Due to unrepresentation of some categories and extremeimbalance, the dataset were transformed to have only two categories. See the paper for more details.\n\nThe dataset is splitted to train and test parts. Current count distribution is as follows:",
"## Dataset Creation",
"### Source Data\n\nAccounts of Russian persons on Twitter that were marked as having tendency to suicide.",
"### Annotations\n\n\n\nSee the paper.",
"#### Personal and Sensitive Information\n\n\n\nThe dataset may contain some personal information that was shared by Twitter users themselves.\n\nBibTeX:",
"## Dataset Card Authors\n\nIgor Buyanov",
"## Dataset Card Contact\n\nURL.o@URL"
] | [
"TAGS\n#task_categories-text-classification #size_categories-10K<n<100K #language-Russian #license-mit #psyhology #text classification #suicide #region-us \n",
"# Dataset Card for Dataset for presuicidal signal detection\n\n\n\nThis dataset dedicated to find texts that contain information that helps to diagnosis person's suicide rating.",
"## Dataset Details",
"### Dataset Description\n\n\n\n- Curated by: Igor Buyanov (URL.o@URL)\n- Language(s) (NLP): Russian\n- License: MIT",
"### Dataset Sources\n\n\n\n- Repository: link\n- Paper: link",
"## Uses\n\n\n\nThe dataset is intended to use to train the model that can help the psychologists to analyze the potential suicidal person accounts faster in order to find clues and facts that helps them in threatment.",
"## Dataset Structure\n\nThe dataset has two categories: the normal text (0) and text with potential useful information about person's suicide signals (1). These signals are:\n* Texts describing negative events that occurred with the subject in the past or in the present - messages that are factual, describing negative moments that can happen to a person, such as attempts and facts of rape, problems with parents, the fact of being in a psychiatric hospital, facts of self-harm, etc.\n* Current negative emotional state - messages containing a display of subjective negative attitude towards oneself and others, including a desire to die, a feeling of pressure from the past, self-hatred, aggressiveness, rage directed at oneself or others.\n\nNote that source dataset that was pointed in Repository contains five categories. Due to unrepresentation of some categories and extremeimbalance, the dataset were transformed to have only two categories. See the paper for more details.\n\nThe dataset is splitted to train and test parts. Current count distribution is as follows:",
"## Dataset Creation",
"### Source Data\n\nAccounts of Russian persons on Twitter that were marked as having tendency to suicide.",
"### Annotations\n\n\n\nSee the paper.",
"#### Personal and Sensitive Information\n\n\n\nThe dataset may contain some personal information that was shared by Twitter users themselves.\n\nBibTeX:",
"## Dataset Card Authors\n\nIgor Buyanov",
"## Dataset Card Contact\n\nURL.o@URL"
] | [
51,
37,
4,
35,
16,
50,
240,
5,
22,
9,
29,
10,
10
] | [
"passage: TAGS\n#task_categories-text-classification #size_categories-10K<n<100K #language-Russian #license-mit #psyhology #text classification #suicide #region-us \n# Dataset Card for Dataset for presuicidal signal detection\n\n\n\nThis dataset dedicated to find texts that contain information that helps to diagnosis person's suicide rating.## Dataset Details### Dataset Description\n\n\n\n- Curated by: Igor Buyanov (URL.o@URL)\n- Language(s) (NLP): Russian\n- License: MIT### Dataset Sources\n\n\n\n- Repository: link\n- Paper: link## Uses\n\n\n\nThe dataset is intended to use to train the model that can help the psychologists to analyze the potential suicidal person accounts faster in order to find clues and facts that helps them in threatment.## Dataset Structure\n\nThe dataset has two categories: the normal text (0) and text with potential useful information about person's suicide signals (1). These signals are:\n* Texts describing negative events that occurred with the subject in the past or in the present - messages that are factual, describing negative moments that can happen to a person, such as attempts and facts of rape, problems with parents, the fact of being in a psychiatric hospital, facts of self-harm, etc.\n* Current negative emotional state - messages containing a display of subjective negative attitude towards oneself and others, including a desire to die, a feeling of pressure from the past, self-hatred, aggressiveness, rage directed at oneself or others.\n\nNote that source dataset that was pointed in Repository contains five categories. Due to unrepresentation of some categories and extremeimbalance, the dataset were transformed to have only two categories. See the paper for more details.\n\nThe dataset is splitted to train and test parts. Current count distribution is as follows:## Dataset Creation### Source Data\n\nAccounts of Russian persons on Twitter that were marked as having tendency to suicide.### Annotations\n\n\n\nSee the paper.#### Personal and Sensitive Information\n\n\n\nThe dataset may contain some personal information that was shared by Twitter users themselves.\n\nBibTeX:"
] |
828111a22e12bf4bc6086576fd63fe4984c11545 |
# Dataset Card for Evaluation run of ed001/datascience-coder-6.7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ed001/datascience-coder-6.7b](https://huggingface.co/ed001/datascience-coder-6.7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ed001__datascience-coder-6.7b",
"harness_winogrande_5",
split="train")
```
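As a complementary sketch (not part of the original card), the snippet below lists the available configurations and loads the aggregated "results" configuration mentioned above. It assumes the "results" configuration exposes a "train" split pointing to the latest run, mirroring what the card states for the per-task configurations.

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_ed001__datascience-coder-6.7b"

# List the 63 per-task configurations plus the aggregated "results" configuration.
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# Load the aggregated results; "train" is assumed here to point to the latest run.
results = load_dataset(repo, "results", split="train")
print(results[0])
```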
## Latest results
These are the [latest results from run 2024-01-05T09:33:35.006022](https://huggingface.co/datasets/open-llm-leaderboard/details_ed001__datascience-coder-6.7b/blob/main/results_2024-01-05T09-33-35.006022.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.38026416351025355,
"acc_stderr": 0.03435235823130946,
"acc_norm": 0.38170571467795217,
"acc_norm_stderr": 0.03507989456880972,
"mc1": 0.2741738066095471,
"mc1_stderr": 0.015616518497219373,
"mc2": 0.44821795300523526,
"mc2_stderr": 0.01501348980684818
},
"harness|arc:challenge|25": {
"acc": 0.3430034129692833,
"acc_stderr": 0.01387242322371817,
"acc_norm": 0.3464163822525597,
"acc_norm_stderr": 0.013905011180063242
},
"harness|hellaswag|10": {
"acc": 0.41057558255327625,
"acc_stderr": 0.004909328992915071,
"acc_norm": 0.538338976299542,
"acc_norm_stderr": 0.00497509105569719
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3092105263157895,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.3092105263157895,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4226415094339623,
"acc_stderr": 0.030402331445769537,
"acc_norm": 0.4226415094339623,
"acc_norm_stderr": 0.030402331445769537
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3472222222222222,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.3472222222222222,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3352601156069364,
"acc_stderr": 0.03599586301247078,
"acc_norm": 0.3352601156069364,
"acc_norm_stderr": 0.03599586301247078
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.04440521906179328,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.04440521906179328
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.33191489361702126,
"acc_stderr": 0.030783736757745647,
"acc_norm": 0.33191489361702126,
"acc_norm_stderr": 0.030783736757745647
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.041857744240220554,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.041857744240220554
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3439153439153439,
"acc_stderr": 0.024464426625596433,
"acc_norm": 0.3439153439153439,
"acc_norm_stderr": 0.024464426625596433
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017087,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017087
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.38064516129032255,
"acc_stderr": 0.02762171783290703,
"acc_norm": 0.38064516129032255,
"acc_norm_stderr": 0.02762171783290703
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.33004926108374383,
"acc_stderr": 0.033085304262282574,
"acc_norm": 0.33004926108374383,
"acc_norm_stderr": 0.033085304262282574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3939393939393939,
"acc_stderr": 0.038154943086889305,
"acc_norm": 0.3939393939393939,
"acc_norm_stderr": 0.038154943086889305
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.40404040404040403,
"acc_stderr": 0.034961309720561266,
"acc_norm": 0.40404040404040403,
"acc_norm_stderr": 0.034961309720561266
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.41450777202072536,
"acc_stderr": 0.03555300319557672,
"acc_norm": 0.41450777202072536,
"acc_norm_stderr": 0.03555300319557672
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3717948717948718,
"acc_stderr": 0.024503472557110936,
"acc_norm": 0.3717948717948718,
"acc_norm_stderr": 0.024503472557110936
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.02773896963217609,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.02773896963217609
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.40756302521008403,
"acc_stderr": 0.03191863374478466,
"acc_norm": 0.40756302521008403,
"acc_norm_stderr": 0.03191863374478466
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.03734535676787198,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.03734535676787198
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3779816513761468,
"acc_stderr": 0.020789187066728113,
"acc_norm": 0.3779816513761468,
"acc_norm_stderr": 0.020789187066728113
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.031415546294025425,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.031415546294025425
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.39705882352941174,
"acc_stderr": 0.03434131164719129,
"acc_norm": 0.39705882352941174,
"acc_norm_stderr": 0.03434131164719129
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4050632911392405,
"acc_stderr": 0.03195514741370674,
"acc_norm": 0.4050632911392405,
"acc_norm_stderr": 0.03195514741370674
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.34080717488789236,
"acc_stderr": 0.03181149747055359,
"acc_norm": 0.34080717488789236,
"acc_norm_stderr": 0.03181149747055359
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.366412213740458,
"acc_stderr": 0.04225875451969638,
"acc_norm": 0.366412213740458,
"acc_norm_stderr": 0.04225875451969638
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.45454545454545453,
"acc_stderr": 0.045454545454545456,
"acc_norm": 0.45454545454545453,
"acc_norm_stderr": 0.045454545454545456
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3611111111111111,
"acc_stderr": 0.04643454608906275,
"acc_norm": 0.3611111111111111,
"acc_norm_stderr": 0.04643454608906275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4233128834355828,
"acc_stderr": 0.03881891213334382,
"acc_norm": 0.4233128834355828,
"acc_norm_stderr": 0.03881891213334382
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285714,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285714
},
"harness|hendrycksTest-management|5": {
"acc": 0.39805825242718446,
"acc_stderr": 0.04846748253977239,
"acc_norm": 0.39805825242718446,
"acc_norm_stderr": 0.04846748253977239
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6282051282051282,
"acc_stderr": 0.03166098891888078,
"acc_norm": 0.6282051282051282,
"acc_norm_stderr": 0.03166098891888078
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.438058748403576,
"acc_stderr": 0.017742232238257247,
"acc_norm": 0.438058748403576,
"acc_norm_stderr": 0.017742232238257247
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3872832369942196,
"acc_stderr": 0.026226158605124655,
"acc_norm": 0.3872832369942196,
"acc_norm_stderr": 0.026226158605124655
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25251396648044694,
"acc_stderr": 0.014530330201468648,
"acc_norm": 0.25251396648044694,
"acc_norm_stderr": 0.014530330201468648
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.028074158947600666,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.028074158947600666
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4340836012861736,
"acc_stderr": 0.02815023224453559,
"acc_norm": 0.4340836012861736,
"acc_norm_stderr": 0.02815023224453559
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.025630824975621348,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.025630824975621348
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3262411347517731,
"acc_stderr": 0.027968453043563168,
"acc_norm": 0.3262411347517731,
"acc_norm_stderr": 0.027968453043563168
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.27509778357235987,
"acc_stderr": 0.01140544362099692,
"acc_norm": 0.27509778357235987,
"acc_norm_stderr": 0.01140544362099692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.375,
"acc_stderr": 0.029408372932278746,
"acc_norm": 0.375,
"acc_norm_stderr": 0.029408372932278746
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3088235294117647,
"acc_stderr": 0.018690850273595287,
"acc_norm": 0.3088235294117647,
"acc_norm_stderr": 0.018690850273595287
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.509090909090909,
"acc_stderr": 0.0478833976870286,
"acc_norm": 0.509090909090909,
"acc_norm_stderr": 0.0478833976870286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4530612244897959,
"acc_stderr": 0.03186785930004128,
"acc_norm": 0.4530612244897959,
"acc_norm_stderr": 0.03186785930004128
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.43781094527363185,
"acc_stderr": 0.0350808011219984,
"acc_norm": 0.43781094527363185,
"acc_norm_stderr": 0.0350808011219984
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-virology|5": {
"acc": 0.37349397590361444,
"acc_stderr": 0.037658451171688624,
"acc_norm": 0.37349397590361444,
"acc_norm_stderr": 0.037658451171688624
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3567251461988304,
"acc_stderr": 0.03674013002860954,
"acc_norm": 0.3567251461988304,
"acc_norm_stderr": 0.03674013002860954
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2741738066095471,
"mc1_stderr": 0.015616518497219373,
"mc2": 0.44821795300523526,
"mc2_stderr": 0.01501348980684818
},
"harness|winogrande|5": {
"acc": 0.5572217837411207,
"acc_stderr": 0.013960157350784987
},
"harness|gsm8k|5": {
"acc": 0.2494313874147081,
"acc_stderr": 0.011918265218445523
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_ed001__datascience-coder-6.7b | [
"region:us"
] | 2024-01-05T09:35:58+00:00 | {"pretty_name": "Evaluation run of ed001/datascience-coder-6.7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [ed001/datascience-coder-6.7b](https://huggingface.co/ed001/datascience-coder-6.7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ed001__datascience-coder-6.7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-05T09:33:35.006022](https://huggingface.co/datasets/open-llm-leaderboard/details_ed001__datascience-coder-6.7b/blob/main/results_2024-01-05T09-33-35.006022.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.38026416351025355,\n \"acc_stderr\": 0.03435235823130946,\n \"acc_norm\": 0.38170571467795217,\n \"acc_norm_stderr\": 0.03507989456880972,\n \"mc1\": 0.2741738066095471,\n \"mc1_stderr\": 0.015616518497219373,\n \"mc2\": 0.44821795300523526,\n \"mc2_stderr\": 0.01501348980684818\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3430034129692833,\n \"acc_stderr\": 0.01387242322371817,\n \"acc_norm\": 0.3464163822525597,\n \"acc_norm_stderr\": 0.013905011180063242\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.41057558255327625,\n \"acc_stderr\": 0.004909328992915071,\n \"acc_norm\": 0.538338976299542,\n \"acc_norm_stderr\": 0.00497509105569719\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.37777777777777777,\n \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.37777777777777777,\n \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3092105263157895,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.3092105263157895,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.4226415094339623,\n \"acc_stderr\": 0.030402331445769537,\n \"acc_norm\": 0.4226415094339623,\n \"acc_norm_stderr\": 0.030402331445769537\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3472222222222222,\n \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.3472222222222222,\n \"acc_norm_stderr\": 0.039812405437178615\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3352601156069364,\n \"acc_stderr\": 0.03599586301247078,\n \"acc_norm\": 0.3352601156069364,\n \"acc_norm_stderr\": 0.03599586301247078\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179328,\n \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179328\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.33191489361702126,\n \"acc_stderr\": 0.030783736757745647,\n \"acc_norm\": 0.33191489361702126,\n \"acc_norm_stderr\": 0.030783736757745647\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n \"acc_stderr\": 0.041857744240220554,\n \"acc_norm\": 0.2719298245614035,\n \"acc_norm_stderr\": 0.041857744240220554\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3439153439153439,\n \"acc_stderr\": 0.024464426625596433,\n \"acc_norm\": 0.3439153439153439,\n \"acc_norm_stderr\": 0.024464426625596433\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n \"acc_stderr\": 0.04325506042017087,\n \"acc_norm\": 0.373015873015873,\n \"acc_norm_stderr\": 0.04325506042017087\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.38064516129032255,\n \"acc_stderr\": 0.02762171783290703,\n \"acc_norm\": 0.38064516129032255,\n \"acc_norm_stderr\": 0.02762171783290703\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.33004926108374383,\n \"acc_stderr\": 0.033085304262282574,\n \"acc_norm\": 0.33004926108374383,\n \"acc_norm_stderr\": 0.033085304262282574\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.3939393939393939,\n \"acc_stderr\": 0.038154943086889305,\n \"acc_norm\": 0.3939393939393939,\n \"acc_norm_stderr\": 0.038154943086889305\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.40404040404040403,\n \"acc_stderr\": 0.034961309720561266,\n \"acc_norm\": 0.40404040404040403,\n \"acc_norm_stderr\": 0.034961309720561266\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.41450777202072536,\n \"acc_stderr\": 0.03555300319557672,\n \"acc_norm\": 0.41450777202072536,\n 
\"acc_norm_stderr\": 0.03555300319557672\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.3717948717948718,\n \"acc_stderr\": 0.024503472557110936,\n \"acc_norm\": 0.3717948717948718,\n \"acc_norm_stderr\": 0.024503472557110936\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.29259259259259257,\n \"acc_stderr\": 0.02773896963217609,\n \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.02773896963217609\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.40756302521008403,\n \"acc_stderr\": 0.03191863374478466,\n \"acc_norm\": 0.40756302521008403,\n \"acc_norm_stderr\": 0.03191863374478466\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2980132450331126,\n \"acc_stderr\": 0.03734535676787198,\n \"acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.03734535676787198\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3779816513761468,\n \"acc_stderr\": 0.020789187066728113,\n \"acc_norm\": 0.3779816513761468,\n \"acc_norm_stderr\": 0.020789187066728113\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3055555555555556,\n \"acc_stderr\": 0.031415546294025425,\n \"acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.031415546294025425\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.39705882352941174,\n \"acc_stderr\": 0.03434131164719129,\n \"acc_norm\": 0.39705882352941174,\n \"acc_norm_stderr\": 0.03434131164719129\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.4050632911392405,\n \"acc_stderr\": 0.03195514741370674,\n \"acc_norm\": 0.4050632911392405,\n \"acc_norm_stderr\": 0.03195514741370674\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.34080717488789236,\n \"acc_stderr\": 0.03181149747055359,\n \"acc_norm\": 0.34080717488789236,\n \"acc_norm_stderr\": 0.03181149747055359\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.366412213740458,\n \"acc_stderr\": 0.04225875451969638,\n \"acc_norm\": 0.366412213740458,\n \"acc_norm_stderr\": 0.04225875451969638\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.45454545454545453,\n \"acc_stderr\": 0.045454545454545456,\n \"acc_norm\": 0.45454545454545453,\n \"acc_norm_stderr\": 0.045454545454545456\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3611111111111111,\n \"acc_stderr\": 0.04643454608906275,\n \"acc_norm\": 0.3611111111111111,\n \"acc_norm_stderr\": 0.04643454608906275\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.4233128834355828,\n \"acc_stderr\": 0.03881891213334382,\n \"acc_norm\": 0.4233128834355828,\n \"acc_norm_stderr\": 0.03881891213334382\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n \"acc_stderr\": 0.04464285714285714,\n \"acc_norm\": 0.33035714285714285,\n \"acc_norm_stderr\": 0.04464285714285714\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.39805825242718446,\n \"acc_stderr\": 0.04846748253977239,\n \"acc_norm\": 0.39805825242718446,\n \"acc_norm_stderr\": 0.04846748253977239\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6282051282051282,\n \"acc_stderr\": 0.03166098891888078,\n \"acc_norm\": 0.6282051282051282,\n \"acc_norm_stderr\": 0.03166098891888078\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.438058748403576,\n \"acc_stderr\": 0.017742232238257247,\n \"acc_norm\": 0.438058748403576,\n \"acc_norm_stderr\": 0.017742232238257247\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.3872832369942196,\n \"acc_stderr\": 0.026226158605124655,\n \"acc_norm\": 0.3872832369942196,\n \"acc_norm_stderr\": 0.026226158605124655\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25251396648044694,\n \"acc_stderr\": 0.014530330201468648,\n \"acc_norm\": 0.25251396648044694,\n \"acc_norm_stderr\": 0.014530330201468648\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.028074158947600666,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.028074158947600666\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4340836012861736,\n \"acc_stderr\": 0.02815023224453559,\n \"acc_norm\": 0.4340836012861736,\n \"acc_norm_stderr\": 0.02815023224453559\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.3055555555555556,\n \"acc_stderr\": 0.025630824975621348,\n \"acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.025630824975621348\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3262411347517731,\n \"acc_stderr\": 0.027968453043563168,\n \"acc_norm\": 0.3262411347517731,\n \"acc_norm_stderr\": 0.027968453043563168\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.27509778357235987,\n \"acc_stderr\": 0.01140544362099692,\n \"acc_norm\": 0.27509778357235987,\n \"acc_norm_stderr\": 0.01140544362099692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.029408372932278746,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.029408372932278746\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.3088235294117647,\n \"acc_stderr\": 0.018690850273595287,\n \"acc_norm\": 0.3088235294117647,\n \"acc_norm_stderr\": 0.018690850273595287\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.509090909090909,\n \"acc_stderr\": 0.0478833976870286,\n \"acc_norm\": 0.509090909090909,\n \"acc_norm_stderr\": 0.0478833976870286\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.4530612244897959,\n \"acc_stderr\": 0.03186785930004128,\n \"acc_norm\": 0.4530612244897959,\n \"acc_norm_stderr\": 0.03186785930004128\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.43781094527363185,\n \"acc_stderr\": 0.0350808011219984,\n \"acc_norm\": 0.43781094527363185,\n \"acc_norm_stderr\": 0.0350808011219984\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.37349397590361444,\n \"acc_stderr\": 0.037658451171688624,\n \"acc_norm\": 0.37349397590361444,\n \"acc_norm_stderr\": 0.037658451171688624\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3567251461988304,\n \"acc_stderr\": 0.03674013002860954,\n \"acc_norm\": 0.3567251461988304,\n \"acc_norm_stderr\": 0.03674013002860954\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2741738066095471,\n \"mc1_stderr\": 0.015616518497219373,\n \"mc2\": 0.44821795300523526,\n \"mc2_stderr\": 0.01501348980684818\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5572217837411207,\n \"acc_stderr\": 0.013960157350784987\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2494313874147081,\n 
\"acc_stderr\": 0.011918265218445523\n }\n}\n```", "repo_url": "https://huggingface.co/ed001/datascience-coder-6.7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|arc:challenge|25_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|gsm8k|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hellaswag|10_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T09-33-35.006022.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T09-33-35.006022.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T09-33-35.006022.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T09-33-35.006022.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T09-33-35.006022.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_05T09_33_35.006022", "path": ["**/details_harness|winogrande|5_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-05T09-33-35.006022.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_05T09_33_35.006022", "path": ["results_2024-01-05T09-33-35.006022.parquet"]}, {"split": "latest", "path": ["results_2024-01-05T09-33-35.006022.parquet"]}]}]} | 2024-01-05T09:36:21+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of ed001/datascience-coder-6.7b
Dataset automatically created during the evaluation run of model ed001/datascience-coder-6.7b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
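A minimal sketch (the repository id below follows the usual `details_{org}__{model}` naming pattern and is an assumption here, as is the `harness_winogrande_5` config name):

```python
from datasets import load_dataset

# Assumed repo id, following the usual details_{org}__{model} convention
data = load_dataset(
    "open-llm-leaderboard/details_ed001__datascience-coder-6.7b",
    "harness_winogrande_5",
    split="train",
)
```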
## Latest results
These are the latest results from run 2024-01-05T09:33:35.006022 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of ed001/datascience-coder-6.7b\n\n\n\nDataset automatically created during the evaluation run of model ed001/datascience-coder-6.7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T09:33:35.006022(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of ed001/datascience-coder-6.7b\n\n\n\nDataset automatically created during the evaluation run of model ed001/datascience-coder-6.7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T09:33:35.006022(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
183,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of ed001/datascience-coder-6.7b\n\n\n\nDataset automatically created during the evaluation run of model ed001/datascience-coder-6.7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-05T09:33:35.006022(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
83d2e3c5b405c3c01c81df5fdcc2d6a27a3a8a1a |
# Dataset Card for Evaluation run of malhajar/Mistral-7B-v0.2-meditron-turkish
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [malhajar/Mistral-7B-v0.2-meditron-turkish](https://huggingface.co/malhajar/Mistral-7B-v0.2-meditron-turkish) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_malhajar__Mistral-7B-v0.2-meditron-turkish",
"harness_winogrande_5",
split="train")
```
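Because this card aggregates 2 runs, each run is also exposed as its own timestamped split. A minimal sketch for pulling one specific run (the split name below is taken from the config metadata further down this card; passing it directly to `split=` is assumed to work the same way as `"train"`):

```python
from datasets import load_dataset

# Load the details of one specific run by using its timestamped split name
# (here the 2024-01-05T09:37:57 run) instead of the "train"/"latest" alias.
run_details = load_dataset(
    "open-llm-leaderboard/details_malhajar__Mistral-7B-v0.2-meditron-turkish",
    "harness_winogrande_5",
    split="2024_01_05T09_37_57.221599",
)
print(run_details)
```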
## Latest results
These are the [latest results from run 2024-01-05T09:37:57.221599](https://huggingface.co/datasets/open-llm-leaderboard/details_malhajar__Mistral-7B-v0.2-meditron-turkish/blob/main/results_2024-01-05T09-37-57.221599.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.60159226527681,
"acc_stderr": 0.033104690476384036,
"acc_norm": 0.6069622523870655,
"acc_norm_stderr": 0.03378038316382859,
"mc1": 0.4663402692778458,
"mc1_stderr": 0.017463793867168103,
"mc2": 0.6619182579327776,
"mc2_stderr": 0.014732292169528463
},
"harness|arc:challenge|25": {
"acc": 0.5546075085324232,
"acc_stderr": 0.01452398763834408,
"acc_norm": 0.5955631399317406,
"acc_norm_stderr": 0.01434203648343618
},
"harness|hellaswag|10": {
"acc": 0.6233817964548894,
"acc_stderr": 0.004835475957610925,
"acc_norm": 0.8178649671380203,
"acc_norm_stderr": 0.003851669934633879
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.0498887651569859,
"acc_norm": 0.44,
"acc_norm_stderr": 0.0498887651569859
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5722543352601156,
"acc_stderr": 0.03772446857518026,
"acc_norm": 0.5722543352601156,
"acc_norm_stderr": 0.03772446857518026
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5361702127659574,
"acc_stderr": 0.032600385118357715,
"acc_norm": 0.5361702127659574,
"acc_norm_stderr": 0.032600385118357715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.38596491228070173,
"acc_stderr": 0.04579639422070434,
"acc_norm": 0.38596491228070173,
"acc_norm_stderr": 0.04579639422070434
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3862433862433862,
"acc_stderr": 0.025075981767601684,
"acc_norm": 0.3862433862433862,
"acc_norm_stderr": 0.025075981767601684
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949098,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949098
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6935483870967742,
"acc_stderr": 0.026226485652553883,
"acc_norm": 0.6935483870967742,
"acc_norm_stderr": 0.026226485652553883
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.03095405547036589,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.03095405547036589
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.026499057701397443,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.026499057701397443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5487179487179488,
"acc_stderr": 0.025230381238934837,
"acc_norm": 0.5487179487179488,
"acc_norm_stderr": 0.025230381238934837
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606648,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606648
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7981651376146789,
"acc_stderr": 0.017208579357787575,
"acc_norm": 0.7981651376146789,
"acc_norm_stderr": 0.017208579357787575
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.03019028245350195,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.03019028245350195
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159253,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159253
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6278026905829597,
"acc_stderr": 0.032443052830087304,
"acc_norm": 0.6278026905829597,
"acc_norm_stderr": 0.032443052830087304
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097653,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097653
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.043546310772605935,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.043546310772605935
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.02336505149175371,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.02336505149175371
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7803320561941252,
"acc_stderr": 0.014805384478371151,
"acc_norm": 0.7803320561941252,
"acc_norm_stderr": 0.014805384478371151
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6734104046242775,
"acc_stderr": 0.02524826477424284,
"acc_norm": 0.6734104046242775,
"acc_norm_stderr": 0.02524826477424284
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2748603351955307,
"acc_stderr": 0.01493131670322051,
"acc_norm": 0.2748603351955307,
"acc_norm_stderr": 0.01493131670322051
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.02656892101545715,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.02656892101545715
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.026311858071854155,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.026311858071854155
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6820987654320988,
"acc_stderr": 0.025910063528240875,
"acc_norm": 0.6820987654320988,
"acc_norm_stderr": 0.025910063528240875
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.029680105565029036,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.029680105565029036
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42959582790091266,
"acc_stderr": 0.012643004623790205,
"acc_norm": 0.42959582790091266,
"acc_norm_stderr": 0.012643004623790205
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.02952009569768776,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.02952009569768776
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6225490196078431,
"acc_stderr": 0.01961085147488029,
"acc_norm": 0.6225490196078431,
"acc_norm_stderr": 0.01961085147488029
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.04309118709946458,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.04309118709946458
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.0282638899437846,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.0282638899437846
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7313432835820896,
"acc_stderr": 0.03134328358208955,
"acc_norm": 0.7313432835820896,
"acc_norm_stderr": 0.03134328358208955
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.02954774168764004,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.02954774168764004
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4663402692778458,
"mc1_stderr": 0.017463793867168103,
"mc2": 0.6619182579327776,
"mc2_stderr": 0.014732292169528463
},
"harness|winogrande|5": {
"acc": 0.7624309392265194,
"acc_stderr": 0.01196129890580315
},
"harness|gsm8k|5": {
"acc": 0.3593631539044731,
"acc_stderr": 0.01321645630985154
}
}
```
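The aggregated metrics shown above also live in the "results" configuration mentioned earlier, with one summary row per run. A minimal sketch for loading it (the `"latest"` split alias is assumed to behave the same way it does for the per-task configs):

```python
from datasets import load_dataset

# Aggregated metrics (the JSON shown above) as a dataset row; "latest"
# is assumed to point at the most recent run, mirroring the task configs.
results = load_dataset(
    "open-llm-leaderboard/details_malhajar__Mistral-7B-v0.2-meditron-turkish",
    "results",
    split="latest",
)
print(results.to_pandas())
```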
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_malhajar__Mistral-7B-v0.2-meditron-turkish | [
"region:us"
] | 2024-01-05T09:38:54+00:00 | {"pretty_name": "Evaluation run of malhajar/Mistral-7B-v0.2-meditron-turkish", "dataset_summary": "Dataset automatically created during the evaluation run of model [malhajar/Mistral-7B-v0.2-meditron-turkish](https://huggingface.co/malhajar/Mistral-7B-v0.2-meditron-turkish) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_malhajar__Mistral-7B-v0.2-meditron-turkish\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-05T09:37:57.221599](https://huggingface.co/datasets/open-llm-leaderboard/details_malhajar__Mistral-7B-v0.2-meditron-turkish/blob/main/results_2024-01-05T09-37-57.221599.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.60159226527681,\n \"acc_stderr\": 0.033104690476384036,\n \"acc_norm\": 0.6069622523870655,\n \"acc_norm_stderr\": 0.03378038316382859,\n \"mc1\": 0.4663402692778458,\n \"mc1_stderr\": 0.017463793867168103,\n \"mc2\": 0.6619182579327776,\n \"mc2_stderr\": 0.014732292169528463\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5546075085324232,\n \"acc_stderr\": 0.01452398763834408,\n \"acc_norm\": 0.5955631399317406,\n \"acc_norm_stderr\": 0.01434203648343618\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6233817964548894,\n \"acc_stderr\": 0.004835475957610925,\n \"acc_norm\": 0.8178649671380203,\n \"acc_norm_stderr\": 0.003851669934633879\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n \"acc_norm_stderr\": 0.03827052357950756\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.0498887651569859,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.0498887651569859\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5722543352601156,\n \"acc_stderr\": 0.03772446857518026,\n \"acc_norm\": 0.5722543352601156,\n \"acc_norm_stderr\": 0.03772446857518026\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.032600385118357715,\n \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.032600385118357715\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.38596491228070173,\n \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3862433862433862,\n \"acc_stderr\": 0.025075981767601684,\n \"acc_norm\": 0.3862433862433862,\n \"acc_norm_stderr\": 0.025075981767601684\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n \"acc_stderr\": 0.04343525428949098,\n \"acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.04343525428949098\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6935483870967742,\n \"acc_stderr\": 0.026226485652553883,\n \"acc_norm\": 0.6935483870967742,\n \"acc_norm_stderr\": 0.026226485652553883\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7474747474747475,\n \"acc_stderr\": 0.03095405547036589,\n \"acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.03095405547036589\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.026499057701397443,\n \"acc_norm\": 0.8393782383419689,\n 
\"acc_norm_stderr\": 0.026499057701397443\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5487179487179488,\n \"acc_stderr\": 0.025230381238934837,\n \"acc_norm\": 0.5487179487179488,\n \"acc_norm_stderr\": 0.025230381238934837\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606648,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606648\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7981651376146789,\n \"acc_stderr\": 0.017208579357787575,\n \"acc_norm\": 0.7981651376146789,\n \"acc_norm_stderr\": 0.017208579357787575\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.03019028245350195,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.03019028245350195\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159253,\n \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159253\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6278026905829597,\n \"acc_stderr\": 0.032443052830087304,\n \"acc_norm\": 0.6278026905829597,\n \"acc_norm_stderr\": 0.032443052830087304\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097653,\n \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097653\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.043546310772605935,\n \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.043546310772605935\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n \"acc_stderr\": 0.02336505149175371,\n \"acc_norm\": 0.8504273504273504,\n \"acc_norm_stderr\": 0.02336505149175371\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7803320561941252,\n \"acc_stderr\": 0.014805384478371151,\n \"acc_norm\": 0.7803320561941252,\n \"acc_norm_stderr\": 0.014805384478371151\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6734104046242775,\n \"acc_stderr\": 0.02524826477424284,\n \"acc_norm\": 0.6734104046242775,\n \"acc_norm_stderr\": 0.02524826477424284\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2748603351955307,\n \"acc_stderr\": 0.01493131670322051,\n \"acc_norm\": 0.2748603351955307,\n \"acc_norm_stderr\": 0.01493131670322051\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.02656892101545715,\n \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.02656892101545715\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n \"acc_stderr\": 0.026311858071854155,\n \"acc_norm\": 0.6881028938906752,\n \"acc_norm_stderr\": 0.026311858071854155\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6820987654320988,\n \"acc_stderr\": 0.025910063528240875,\n \"acc_norm\": 0.6820987654320988,\n \"acc_norm_stderr\": 0.025910063528240875\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42959582790091266,\n \"acc_stderr\": 0.012643004623790205,\n \"acc_norm\": 0.42959582790091266,\n \"acc_norm_stderr\": 0.012643004623790205\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.02952009569768776,\n \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.02952009569768776\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6225490196078431,\n \"acc_stderr\": 0.01961085147488029,\n \"acc_norm\": 0.6225490196078431,\n \"acc_norm_stderr\": 0.01961085147488029\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.04309118709946458,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.04309118709946458\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.0282638899437846,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.0282638899437846\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7313432835820896,\n \"acc_stderr\": 0.03134328358208955,\n \"acc_norm\": 0.7313432835820896,\n \"acc_norm_stderr\": 0.03134328358208955\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.02954774168764004,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.02954774168764004\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4663402692778458,\n \"mc1_stderr\": 0.017463793867168103,\n \"mc2\": 0.6619182579327776,\n \"mc2_stderr\": 0.014732292169528463\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7624309392265194,\n \"acc_stderr\": 0.01196129890580315\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.3593631539044731,\n \"acc_stderr\": 0.01321645630985154\n }\n}\n```", "repo_url": "https://huggingface.co/malhajar/Mistral-7B-v0.2-meditron-turkish", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|arc:challenge|25_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|arc:challenge|25_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|gsm8k|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|gsm8k|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hellaswag|10_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hellaswag|10_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T09-36-36.907397.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T09-36-36.907397.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T09-37-57.221599.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T09-37-57.221599.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T09-37-57.221599.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T09-37-57.221599.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T09-37-57.221599.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": 
"2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": 
"2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T09-36-36.907397.parquet"]}, 
{"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["**/details_harness|winogrande|5_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": ["**/details_harness|winogrande|5_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-05T09-37-57.221599.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_05T09_36_36.907397", "path": ["results_2024-01-05T09-36-36.907397.parquet"]}, {"split": "2024_01_05T09_37_57.221599", "path": 
["results_2024-01-05T09-37-57.221599.parquet"]}, {"split": "latest", "path": ["results_2024-01-05T09-37-57.221599.parquet"]}]}]} | 2024-01-05T09:40:22+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of malhajar/Mistral-7B-v0.2-meditron-turkish
Dataset automatically created during the evaluation run of model malhajar/Mistral-7B-v0.2-meditron-turkish on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
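A minimal sketch is shown below; the repository id is an assumption based on the leaderboard's usual `details_<org>__<model>` naming convention, since this card does not reproduce the snippet itself (the `harness_winogrande_5` configuration does appear in the config listing above):

```python
from datasets import load_dataset

# Repository id assumed from the leaderboard naming convention details_<org>__<model>.
data = load_dataset("open-llm-leaderboard/details_malhajar__Mistral-7B-v0.2-meditron-turkish",
	"harness_winogrande_5",
	split="train")
```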
## Latest results
These are the latest results from run 2024-01-05T09:37:57.221599 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
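The results JSON is not reproduced in this card. As a sketch, the aggregated numbers for this run can be pulled from the "results" configuration, whose "latest" split resolves to results_2024-01-05T09-37-57.221599.parquet according to the config listing above (repository id assumed as in the previous snippet):

```python
from datasets import load_dataset

# "results" stores the aggregated metrics; "latest" points at the most recent run.
results = load_dataset("open-llm-leaderboard/details_malhajar__Mistral-7B-v0.2-meditron-turkish",
	"results",
	split="latest")
print(results[0])
```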
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of malhajar/Mistral-7B-v0.2-meditron-turkish\n\n\n\nDataset automatically created during the evaluation run of model malhajar/Mistral-7B-v0.2-meditron-turkish on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T09:37:57.221599(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of malhajar/Mistral-7B-v0.2-meditron-turkish\n\n\n\nDataset automatically created during the evaluation run of model malhajar/Mistral-7B-v0.2-meditron-turkish on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T09:37:57.221599(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
195,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of malhajar/Mistral-7B-v0.2-meditron-turkish\n\n\n\nDataset automatically created during the evaluation run of model malhajar/Mistral-7B-v0.2-meditron-turkish on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-05T09:37:57.221599(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
aa8ec1655af87c61c30cca0ccbe6c1ace0410bb1 |
# Dataset Card for Evaluation run of deepseek-ai/deepseek-llm-7b-chat
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [deepseek-ai/deepseek-llm-7b-chat](https://huggingface.co/deepseek-ai/deepseek-llm-7b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_deepseek-ai__deepseek-llm-7b-chat",
"harness_winogrande_5",
split="train")
```
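Similarly, the aggregated metrics mentioned above can be inspected through the "results" configuration; a minimal sketch, assuming its split names follow the same pattern as the other detail repositories on the leaderboard:

```python
from datasets import load_dataset

# Aggregated metrics for the latest run of this model, from the "results" configuration.
results = load_dataset("open-llm-leaderboard/details_deepseek-ai__deepseek-llm-7b-chat",
	"results",
	split="latest")
```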
## Latest results
These are the [latest results from run 2024-01-05T10:38:25.592014](https://huggingface.co/datasets/open-llm-leaderboard/details_deepseek-ai__deepseek-llm-7b-chat/blob/main/results_2024-01-05T10-38-25.592014.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5217428488143239,
"acc_stderr": 0.03404743889765096,
"acc_norm": 0.5230686740068565,
"acc_norm_stderr": 0.0347462652663857,
"mc1": 0.33659730722154224,
"mc1_stderr": 0.016542412809494887,
"mc2": 0.47921745550580336,
"mc2_stderr": 0.015430955018425466
},
"harness|arc:challenge|25": {
"acc": 0.5119453924914675,
"acc_stderr": 0.014607220340597167,
"acc_norm": 0.5571672354948806,
"acc_norm_stderr": 0.014515573873348899
},
"harness|hellaswag|10": {
"acc": 0.5957976498705437,
"acc_stderr": 0.0048973407933143795,
"acc_norm": 0.7937661820354511,
"acc_norm_stderr": 0.0040377344515556465
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464242,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464242
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5320754716981132,
"acc_stderr": 0.030709486992556545,
"acc_norm": 0.5320754716981132,
"acc_norm_stderr": 0.030709486992556545
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5,
"acc_stderr": 0.04181210050035455,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04181210050035455
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4913294797687861,
"acc_stderr": 0.03811890988940412,
"acc_norm": 0.4913294797687861,
"acc_norm_stderr": 0.03811890988940412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929776,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929776
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4340425531914894,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.4340425531914894,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.45517241379310347,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.45517241379310347,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31216931216931215,
"acc_stderr": 0.023865206836972602,
"acc_norm": 0.31216931216931215,
"acc_norm_stderr": 0.023865206836972602
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557835,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557835
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5709677419354838,
"acc_stderr": 0.028156036538233193,
"acc_norm": 0.5709677419354838,
"acc_norm_stderr": 0.028156036538233193
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3497536945812808,
"acc_stderr": 0.03355400904969565,
"acc_norm": 0.3497536945812808,
"acc_norm_stderr": 0.03355400904969565
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6787878787878788,
"acc_stderr": 0.03646204963253812,
"acc_norm": 0.6787878787878788,
"acc_norm_stderr": 0.03646204963253812
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6717171717171717,
"acc_stderr": 0.03345678422756775,
"acc_norm": 0.6717171717171717,
"acc_norm_stderr": 0.03345678422756775
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7150259067357513,
"acc_stderr": 0.03257714077709662,
"acc_norm": 0.7150259067357513,
"acc_norm_stderr": 0.03257714077709662
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.025294608023986472,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.025294608023986472
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228423,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228423
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.47478991596638653,
"acc_stderr": 0.032437180551374095,
"acc_norm": 0.47478991596638653,
"acc_norm_stderr": 0.032437180551374095
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.037101857261199946,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.037101857261199946
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7119266055045872,
"acc_stderr": 0.019416445892636025,
"acc_norm": 0.7119266055045872,
"acc_norm_stderr": 0.019416445892636025
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.03381200005643525,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.03381200005643525
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7107843137254902,
"acc_stderr": 0.031822318676475524,
"acc_norm": 0.7107843137254902,
"acc_norm_stderr": 0.031822318676475524
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7257383966244726,
"acc_stderr": 0.029041333510598035,
"acc_norm": 0.7257383966244726,
"acc_norm_stderr": 0.029041333510598035
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5739910313901345,
"acc_stderr": 0.033188332862172806,
"acc_norm": 0.5739910313901345,
"acc_norm_stderr": 0.033188332862172806
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5572519083969466,
"acc_stderr": 0.04356447202665069,
"acc_norm": 0.5572519083969466,
"acc_norm_stderr": 0.04356447202665069
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6611570247933884,
"acc_stderr": 0.0432076780753667,
"acc_norm": 0.6611570247933884,
"acc_norm_stderr": 0.0432076780753667
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.04668408033024931,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.04668408033024931
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6073619631901841,
"acc_stderr": 0.03836740907831029,
"acc_norm": 0.6073619631901841,
"acc_norm_stderr": 0.03836740907831029
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.6504854368932039,
"acc_stderr": 0.047211885060971716,
"acc_norm": 0.6504854368932039,
"acc_norm_stderr": 0.047211885060971716
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.025140935950335442,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.025140935950335442
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7305236270753512,
"acc_stderr": 0.015866243073215058,
"acc_norm": 0.7305236270753512,
"acc_norm_stderr": 0.015866243073215058
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.026483392042098177,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.026483392042098177
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.30614525139664805,
"acc_stderr": 0.015414494487903219,
"acc_norm": 0.30614525139664805,
"acc_norm_stderr": 0.015414494487903219
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5228758169934641,
"acc_stderr": 0.028599936776089775,
"acc_norm": 0.5228758169934641,
"acc_norm_stderr": 0.028599936776089775
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5562700964630225,
"acc_stderr": 0.028217683556652308,
"acc_norm": 0.5562700964630225,
"acc_norm_stderr": 0.028217683556652308
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.027586006221607704,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.027586006221607704
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36524822695035464,
"acc_stderr": 0.028723863853281288,
"acc_norm": 0.36524822695035464,
"acc_norm_stderr": 0.028723863853281288
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4074315514993481,
"acc_stderr": 0.012549473714212226,
"acc_norm": 0.4074315514993481,
"acc_norm_stderr": 0.012549473714212226
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.46691176470588236,
"acc_stderr": 0.030306257722468324,
"acc_norm": 0.46691176470588236,
"acc_norm_stderr": 0.030306257722468324
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5196078431372549,
"acc_stderr": 0.020212274976302957,
"acc_norm": 0.5196078431372549,
"acc_norm_stderr": 0.020212274976302957
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661896,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661896
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6285714285714286,
"acc_stderr": 0.030932858792789845,
"acc_norm": 0.6285714285714286,
"acc_norm_stderr": 0.030932858792789845
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7014925373134329,
"acc_stderr": 0.03235743789355044,
"acc_norm": 0.7014925373134329,
"acc_norm_stderr": 0.03235743789355044
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.038879718495972646,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.038879718495972646
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.032744852119469564,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.032744852119469564
},
"harness|truthfulqa:mc|0": {
"mc1": 0.33659730722154224,
"mc1_stderr": 0.016542412809494887,
"mc2": 0.47921745550580336,
"mc2_stderr": 0.015430955018425466
},
"harness|winogrande|5": {
"acc": 0.7490134175217048,
"acc_stderr": 0.01218577622051616
},
"harness|gsm8k|5": {
"acc": 0.45943896891584535,
"acc_stderr": 0.013727093010429786
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
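Pending fuller documentation, the per-task configurations can be loaded directly for analysis. The sketch below is a minimal, non-authoritative example: the repository id is a placeholder to be replaced with this dataset's id, the configuration name `harness_gsm8k_5` is one of the task configurations listed for this dataset, and the column layout should be inspected rather than assumed.

```python
from datasets import load_dataset

# Placeholder repository id -- substitute the id of this details dataset.
repo_id = "open-llm-leaderboard/details_<org>__<model>"

# Load the per-sample details for one task; the "latest" split points at the most recent run.
details = load_dataset(repo_id, "harness_gsm8k_5", split="latest")

# Inspect the schema before relying on any particular column.
print(details.column_names)
print(details[0])
```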
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
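Until the field-level description is filled in, one way to get an overview of the structure is to enumerate the available configurations and their splits programmatically. This is a minimal sketch using the `datasets` library; the repository id is again a placeholder.

```python
from datasets import get_dataset_config_names, get_dataset_split_names

# Placeholder repository id -- substitute the id of this details dataset.
repo_id = "open-llm-leaderboard/details_<org>__<model>"

# One configuration per evaluated task, plus an aggregated "results" configuration.
configs = get_dataset_config_names(repo_id)
print(len(configs), "configurations")

# Each configuration exposes one timestamped split per run and a "latest" alias.
for name in configs[:5]:
    print(name, get_dataset_split_names(repo_id, name))
```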
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
"region:us"
] | 2024-01-05T09:42:45+00:00 | {"pretty_name": "Evaluation run of deepseek-ai/deepseek-llm-7b-chat", "dataset_summary": "Dataset automatically created during the evaluation run of model [deepseek-ai/deepseek-llm-7b-chat](https://huggingface.co/deepseek-ai/deepseek-llm-7b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_deepseek-ai__deepseek-llm-7b-chat\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-05T10:38:25.592014](https://huggingface.co/datasets/open-llm-leaderboard/details_deepseek-ai__deepseek-llm-7b-chat/blob/main/results_2024-01-05T10-38-25.592014.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5217428488143239,\n \"acc_stderr\": 0.03404743889765096,\n \"acc_norm\": 0.5230686740068565,\n \"acc_norm_stderr\": 0.0347462652663857,\n \"mc1\": 0.33659730722154224,\n \"mc1_stderr\": 0.016542412809494887,\n \"mc2\": 0.47921745550580336,\n \"mc2_stderr\": 0.015430955018425466\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5119453924914675,\n \"acc_stderr\": 0.014607220340597167,\n \"acc_norm\": 0.5571672354948806,\n \"acc_norm_stderr\": 0.014515573873348899\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5957976498705437,\n \"acc_stderr\": 0.0048973407933143795,\n \"acc_norm\": 0.7937661820354511,\n \"acc_norm_stderr\": 0.0040377344515556465\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.45925925925925926,\n \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.04063302731486671,\n \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.04063302731486671\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5320754716981132,\n \"acc_stderr\": 0.030709486992556545,\n \"acc_norm\": 0.5320754716981132,\n \"acc_norm_stderr\": 0.030709486992556545\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04181210050035455,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04181210050035455\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4913294797687861,\n \"acc_stderr\": 0.03811890988940412,\n \"acc_norm\": 0.4913294797687861,\n \"acc_norm_stderr\": 0.03811890988940412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929776,\n \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929776\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4340425531914894,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.4340425531914894,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.31216931216931215,\n \"acc_stderr\": 0.023865206836972602,\n \"acc_norm\": 0.31216931216931215,\n \"acc_norm_stderr\": 0.023865206836972602\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04216370213557835,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04216370213557835\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5709677419354838,\n \"acc_stderr\": 0.028156036538233193,\n \"acc_norm\": 0.5709677419354838,\n \"acc_norm_stderr\": 0.028156036538233193\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3497536945812808,\n \"acc_stderr\": 0.03355400904969565,\n \"acc_norm\": 0.3497536945812808,\n \"acc_norm_stderr\": 0.03355400904969565\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.03646204963253812,\n \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.03646204963253812\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6717171717171717,\n \"acc_stderr\": 0.03345678422756775,\n \"acc_norm\": 0.6717171717171717,\n \"acc_norm_stderr\": 0.03345678422756775\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7150259067357513,\n \"acc_stderr\": 0.03257714077709662,\n \"acc_norm\": 0.7150259067357513,\n \"acc_norm_stderr\": 0.03257714077709662\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.4666666666666667,\n \"acc_stderr\": 0.025294608023986472,\n \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.025294608023986472\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.027940457136228423,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.027940457136228423\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.47478991596638653,\n \"acc_stderr\": 0.032437180551374095,\n \"acc_norm\": 0.47478991596638653,\n \"acc_norm_stderr\": 0.032437180551374095\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2913907284768212,\n \"acc_stderr\": 0.037101857261199946,\n \"acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.037101857261199946\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7119266055045872,\n \"acc_stderr\": 0.019416445892636025,\n \"acc_norm\": 0.7119266055045872,\n \"acc_norm_stderr\": 0.019416445892636025\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4351851851851852,\n \"acc_stderr\": 0.03381200005643525,\n \"acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.03381200005643525\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7107843137254902,\n \"acc_stderr\": 0.031822318676475524,\n \"acc_norm\": 0.7107843137254902,\n \"acc_norm_stderr\": 0.031822318676475524\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7257383966244726,\n \"acc_stderr\": 0.029041333510598035,\n \"acc_norm\": 0.7257383966244726,\n \"acc_norm_stderr\": 0.029041333510598035\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5739910313901345,\n \"acc_stderr\": 0.033188332862172806,\n \"acc_norm\": 0.5739910313901345,\n \"acc_norm_stderr\": 0.033188332862172806\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5572519083969466,\n \"acc_stderr\": 0.04356447202665069,\n \"acc_norm\": 0.5572519083969466,\n \"acc_norm_stderr\": 0.04356447202665069\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6611570247933884,\n \"acc_stderr\": 0.0432076780753667,\n \"acc_norm\": 0.6611570247933884,\n \"acc_norm_stderr\": 0.0432076780753667\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.04668408033024931,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.04668408033024931\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6073619631901841,\n \"acc_stderr\": 0.03836740907831029,\n \"acc_norm\": 0.6073619631901841,\n \"acc_norm_stderr\": 0.03836740907831029\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6504854368932039,\n \"acc_stderr\": 0.047211885060971716,\n \"acc_norm\": 0.6504854368932039,\n \"acc_norm_stderr\": 0.047211885060971716\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n \"acc_stderr\": 0.025140935950335442,\n \"acc_norm\": 0.8205128205128205,\n \"acc_norm_stderr\": 0.025140935950335442\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7305236270753512,\n 
\"acc_stderr\": 0.015866243073215058,\n \"acc_norm\": 0.7305236270753512,\n \"acc_norm_stderr\": 0.015866243073215058\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.026483392042098177,\n \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.026483392042098177\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.30614525139664805,\n \"acc_stderr\": 0.015414494487903219,\n \"acc_norm\": 0.30614525139664805,\n \"acc_norm_stderr\": 0.015414494487903219\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5228758169934641,\n \"acc_stderr\": 0.028599936776089775,\n \"acc_norm\": 0.5228758169934641,\n \"acc_norm_stderr\": 0.028599936776089775\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5562700964630225,\n \"acc_stderr\": 0.028217683556652308,\n \"acc_norm\": 0.5562700964630225,\n \"acc_norm_stderr\": 0.028217683556652308\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.027586006221607704,\n \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.027586006221607704\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.36524822695035464,\n \"acc_stderr\": 0.028723863853281288,\n \"acc_norm\": 0.36524822695035464,\n \"acc_norm_stderr\": 0.028723863853281288\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4074315514993481,\n \"acc_stderr\": 0.012549473714212226,\n \"acc_norm\": 0.4074315514993481,\n \"acc_norm_stderr\": 0.012549473714212226\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.46691176470588236,\n \"acc_stderr\": 0.030306257722468324,\n \"acc_norm\": 0.46691176470588236,\n \"acc_norm_stderr\": 0.030306257722468324\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5196078431372549,\n \"acc_stderr\": 0.020212274976302957,\n \"acc_norm\": 0.5196078431372549,\n \"acc_norm_stderr\": 0.020212274976302957\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n \"acc_stderr\": 0.04709306978661896,\n \"acc_norm\": 0.5909090909090909,\n \"acc_norm_stderr\": 0.04709306978661896\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6285714285714286,\n \"acc_stderr\": 0.030932858792789845,\n \"acc_norm\": 0.6285714285714286,\n \"acc_norm_stderr\": 0.030932858792789845\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7014925373134329,\n \"acc_stderr\": 0.03235743789355044,\n \"acc_norm\": 0.7014925373134329,\n \"acc_norm_stderr\": 0.03235743789355044\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n \"acc_stderr\": 0.038879718495972646,\n \"acc_norm\": 0.4759036144578313,\n \"acc_norm_stderr\": 0.038879718495972646\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.032744852119469564,\n \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.032744852119469564\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.33659730722154224,\n \"mc1_stderr\": 0.016542412809494887,\n \"mc2\": 0.47921745550580336,\n \"mc2_stderr\": 0.015430955018425466\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7490134175217048,\n \"acc_stderr\": 0.01218577622051616\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.45943896891584535,\n \"acc_stderr\": 0.013727093010429786\n 
}\n}\n```", "repo_url": "https://huggingface.co/deepseek-ai/deepseek-llm-7b-chat", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|arc:challenge|25_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|arc:challenge|25_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|gsm8k|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|gsm8k|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hellaswag|10_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hellaswag|10_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T09-40-38.856043.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T09-40-38.856043.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T10-38-25.592014.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T10-38-25.592014.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T10-38-25.592014.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T10-38-25.592014.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T10-38-25.592014.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": 
"2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": 
"2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T09-40-38.856043.parquet"]}, 
{"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["**/details_harness|winogrande|5_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": ["**/details_harness|winogrande|5_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-05T10-38-25.592014.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_05T09_40_38.856043", "path": ["results_2024-01-05T09-40-38.856043.parquet"]}, {"split": "2024_01_05T10_38_25.592014", "path": 
["results_2024-01-05T10-38-25.592014.parquet"]}, {"split": "latest", "path": ["results_2024-01-05T10-38-25.592014.parquet"]}]}]} | 2024-01-05T10:40:58+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of deepseek-ai/deepseek-llm-7b-chat
Dataset automatically created during the evaluation run of model deepseek-ai/deepseek-llm-7b-chat on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
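A minimal sketch (assuming the details repository for this model follows the leaderboard's usual `details_<org>__<model>` naming convention):
```python
from datasets import load_dataset

# Hypothetical repository id, inferred from the leaderboard naming convention
data = load_dataset("open-llm-leaderboard/details_deepseek-ai__deepseek-llm-7b-chat",
	"harness_winogrande_5",
	split="train")
```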
## Latest results
These are the latest results from run 2024-01-05T10:38:25.592014 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of deepseek-ai/deepseek-llm-7b-chat\n\n\n\nDataset automatically created during the evaluation run of model deepseek-ai/deepseek-llm-7b-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T10:38:25.592014(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of deepseek-ai/deepseek-llm-7b-chat\n\n\n\nDataset automatically created during the evaluation run of model deepseek-ai/deepseek-llm-7b-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T10:38:25.592014(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
193,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of deepseek-ai/deepseek-llm-7b-chat\n\n\n\nDataset automatically created during the evaluation run of model deepseek-ai/deepseek-llm-7b-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-05T10:38:25.592014(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
395b6d691233da9cfe76e5a1b7544f70ed18e8c0 |
# Dataset Card for Evaluation run of deepseek-ai/deepseek-coder-6.7b-instruct
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [deepseek-ai/deepseek-coder-6.7b-instruct](https://huggingface.co/deepseek-ai/deepseek-coder-6.7b-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_deepseek-ai__deepseek-coder-6.7b-instruct",
"harness_winogrande_5",
split="train")
```
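If the aggregated per-run metrics are wanted rather than the per-task details, a minimal sketch using the `results` configuration and the `latest` split described above:
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run;
# the "latest" split always points to the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_deepseek-ai__deepseek-coder-6.7b-instruct",
	"results",
	split="latest")
```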
## Latest results
These are the [latest results from run 2024-01-05T09:40:26.509293](https://huggingface.co/datasets/open-llm-leaderboard/details_deepseek-ai__deepseek-coder-6.7b-instruct/blob/main/results_2024-01-05T09-40-26.509293.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3910854432124772,
"acc_stderr": 0.03451531253731588,
"acc_norm": 0.3927686146925743,
"acc_norm_stderr": 0.03524628661020728,
"mc1": 0.2802937576499388,
"mc1_stderr": 0.015723139524608753,
"mc2": 0.455556353489654,
"mc2_stderr": 0.015280106274741655
},
"harness|arc:challenge|25": {
"acc": 0.35921501706484643,
"acc_stderr": 0.014020224155839159,
"acc_norm": 0.38139931740614336,
"acc_norm_stderr": 0.014194389086685263
},
"harness|hellaswag|10": {
"acc": 0.420035849432384,
"acc_stderr": 0.0049255561046794094,
"acc_norm": 0.5508862776339375,
"acc_norm_stderr": 0.004963872936857944
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3355263157894737,
"acc_stderr": 0.038424985593952694,
"acc_norm": 0.3355263157894737,
"acc_norm_stderr": 0.038424985593952694
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.41132075471698115,
"acc_stderr": 0.030285009259009798,
"acc_norm": 0.41132075471698115,
"acc_norm_stderr": 0.030285009259009798
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.039420826399272135,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.039420826399272135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3352601156069364,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.3352601156069364,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171453,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171453
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.34893617021276596,
"acc_stderr": 0.031158522131357787,
"acc_norm": 0.34893617021276596,
"acc_norm_stderr": 0.031158522131357787
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.043391383225798615,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.043391383225798615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4413793103448276,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.4413793103448276,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.02441923496681907,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.02441923496681907
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.41935483870967744,
"acc_stderr": 0.028071588901091835,
"acc_norm": 0.41935483870967744,
"acc_norm_stderr": 0.028071588901091835
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3251231527093596,
"acc_stderr": 0.032957975663112704,
"acc_norm": 0.3251231527093596,
"acc_norm_stderr": 0.032957975663112704
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.37575757575757573,
"acc_stderr": 0.03781887353205983,
"acc_norm": 0.37575757575757573,
"acc_norm_stderr": 0.03781887353205983
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.42424242424242425,
"acc_stderr": 0.03521224908841583,
"acc_norm": 0.42424242424242425,
"acc_norm_stderr": 0.03521224908841583
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.43005181347150256,
"acc_stderr": 0.03572954333144809,
"acc_norm": 0.43005181347150256,
"acc_norm_stderr": 0.03572954333144809
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3564102564102564,
"acc_stderr": 0.0242831405294673,
"acc_norm": 0.3564102564102564,
"acc_norm_stderr": 0.0242831405294673
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.027309140588230165,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.027309140588230165
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.03196876989195778,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.03196876989195778
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.41651376146788993,
"acc_stderr": 0.02113637650403087,
"acc_norm": 0.41651376146788993,
"acc_norm_stderr": 0.02113637650403087
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.36574074074074076,
"acc_stderr": 0.03284738857647207,
"acc_norm": 0.36574074074074076,
"acc_norm_stderr": 0.03284738857647207
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.39705882352941174,
"acc_stderr": 0.03434131164719129,
"acc_norm": 0.39705882352941174,
"acc_norm_stderr": 0.03434131164719129
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3755274261603376,
"acc_stderr": 0.03152256243091156,
"acc_norm": 0.3755274261603376,
"acc_norm_stderr": 0.03152256243091156
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3901345291479821,
"acc_stderr": 0.03273766725459157,
"acc_norm": 0.3901345291479821,
"acc_norm_stderr": 0.03273766725459157
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4198473282442748,
"acc_stderr": 0.04328577215262972,
"acc_norm": 0.4198473282442748,
"acc_norm_stderr": 0.04328577215262972
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.45454545454545453,
"acc_stderr": 0.045454545454545456,
"acc_norm": 0.45454545454545453,
"acc_norm_stderr": 0.045454545454545456
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.04668408033024931,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.04668408033024931
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4601226993865031,
"acc_stderr": 0.03915857291436972,
"acc_norm": 0.4601226993865031,
"acc_norm_stderr": 0.03915857291436972
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285714,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285714
},
"harness|hendrycksTest-management|5": {
"acc": 0.3883495145631068,
"acc_stderr": 0.0482572933735639,
"acc_norm": 0.3883495145631068,
"acc_norm_stderr": 0.0482572933735639
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.031166957367235903,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.031166957367235903
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4367816091954023,
"acc_stderr": 0.017736470837800687,
"acc_norm": 0.4367816091954023,
"acc_norm_stderr": 0.017736470837800687
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4190751445086705,
"acc_stderr": 0.026564178111422622,
"acc_norm": 0.4190751445086705,
"acc_norm_stderr": 0.026564178111422622
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2581005586592179,
"acc_stderr": 0.014635185616527824,
"acc_norm": 0.2581005586592179,
"acc_norm_stderr": 0.014635185616527824
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.02795604616542451,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.02795604616542451
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.42765273311897106,
"acc_stderr": 0.028099240775809567,
"acc_norm": 0.42765273311897106,
"acc_norm_stderr": 0.028099240775809567
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.36419753086419754,
"acc_stderr": 0.02677492989972232,
"acc_norm": 0.36419753086419754,
"acc_norm_stderr": 0.02677492989972232
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3191489361702128,
"acc_stderr": 0.0278079901413202,
"acc_norm": 0.3191489361702128,
"acc_norm_stderr": 0.0278079901413202
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2842242503259452,
"acc_stderr": 0.011519880596516078,
"acc_norm": 0.2842242503259452,
"acc_norm_stderr": 0.011519880596516078
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3492647058823529,
"acc_stderr": 0.028959755196824855,
"acc_norm": 0.3492647058823529,
"acc_norm_stderr": 0.028959755196824855
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.35130718954248363,
"acc_stderr": 0.019312676065786558,
"acc_norm": 0.35130718954248363,
"acc_norm_stderr": 0.019312676065786558
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.04782001791380061,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.04782001791380061
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4530612244897959,
"acc_stderr": 0.03186785930004128,
"acc_norm": 0.4530612244897959,
"acc_norm_stderr": 0.03186785930004128
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.42786069651741293,
"acc_stderr": 0.03498541988407795,
"acc_norm": 0.42786069651741293,
"acc_norm_stderr": 0.03498541988407795
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-virology|5": {
"acc": 0.37349397590361444,
"acc_stderr": 0.037658451171688624,
"acc_norm": 0.37349397590361444,
"acc_norm_stderr": 0.037658451171688624
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3508771929824561,
"acc_stderr": 0.03660298834049162,
"acc_norm": 0.3508771929824561,
"acc_norm_stderr": 0.03660298834049162
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2802937576499388,
"mc1_stderr": 0.015723139524608753,
"mc2": 0.455556353489654,
"mc2_stderr": 0.015280106274741655
},
"harness|winogrande|5": {
"acc": 0.5682715074980268,
"acc_stderr": 0.01392087211001071
},
"harness|gsm8k|5": {
"acc": 0.2676269901440485,
"acc_stderr": 0.012194764427053343
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_deepseek-ai__deepseek-coder-6.7b-instruct | [
"region:us"
] | 2024-01-05T09:42:49+00:00 | {"pretty_name": "Evaluation run of deepseek-ai/deepseek-coder-6.7b-instruct", "dataset_summary": "Dataset automatically created during the evaluation run of model [deepseek-ai/deepseek-coder-6.7b-instruct](https://huggingface.co/deepseek-ai/deepseek-coder-6.7b-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_deepseek-ai__deepseek-coder-6.7b-instruct\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-05T09:40:26.509293](https://huggingface.co/datasets/open-llm-leaderboard/details_deepseek-ai__deepseek-coder-6.7b-instruct/blob/main/results_2024-01-05T09-40-26.509293.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3910854432124772,\n \"acc_stderr\": 0.03451531253731588,\n \"acc_norm\": 0.3927686146925743,\n \"acc_norm_stderr\": 0.03524628661020728,\n \"mc1\": 0.2802937576499388,\n \"mc1_stderr\": 0.015723139524608753,\n \"mc2\": 0.455556353489654,\n \"mc2_stderr\": 0.015280106274741655\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.35921501706484643,\n \"acc_stderr\": 0.014020224155839159,\n \"acc_norm\": 0.38139931740614336,\n \"acc_norm_stderr\": 0.014194389086685263\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.420035849432384,\n \"acc_stderr\": 0.0049255561046794094,\n \"acc_norm\": 0.5508862776339375,\n \"acc_norm_stderr\": 0.004963872936857944\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3355263157894737,\n \"acc_stderr\": 0.038424985593952694,\n \"acc_norm\": 0.3355263157894737,\n \"acc_norm_stderr\": 0.038424985593952694\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.41132075471698115,\n \"acc_stderr\": 0.030285009259009798,\n \"acc_norm\": 0.41132075471698115,\n \"acc_norm_stderr\": 0.030285009259009798\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.039420826399272135,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 
0.039420826399272135\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3352601156069364,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.3352601156069364,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171453,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171453\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.34893617021276596,\n \"acc_stderr\": 0.031158522131357787,\n \"acc_norm\": 0.34893617021276596,\n \"acc_norm_stderr\": 0.031158522131357787\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n \"acc_stderr\": 0.043391383225798615,\n \"acc_norm\": 0.30701754385964913,\n \"acc_norm_stderr\": 0.043391383225798615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4413793103448276,\n \"acc_stderr\": 0.04137931034482758,\n \"acc_norm\": 0.4413793103448276,\n \"acc_norm_stderr\": 0.04137931034482758\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3412698412698413,\n \"acc_stderr\": 0.02441923496681907,\n \"acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.02441923496681907\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.41935483870967744,\n \"acc_stderr\": 0.028071588901091835,\n \"acc_norm\": 0.41935483870967744,\n \"acc_norm_stderr\": 0.028071588901091835\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3251231527093596,\n \"acc_stderr\": 0.032957975663112704,\n \"acc_norm\": 0.3251231527093596,\n \"acc_norm_stderr\": 0.032957975663112704\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.37575757575757573,\n \"acc_stderr\": 0.03781887353205983,\n \"acc_norm\": 0.37575757575757573,\n \"acc_norm_stderr\": 0.03781887353205983\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.42424242424242425,\n \"acc_stderr\": 0.03521224908841583,\n \"acc_norm\": 0.42424242424242425,\n \"acc_norm_stderr\": 0.03521224908841583\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.43005181347150256,\n \"acc_stderr\": 0.03572954333144809,\n 
\"acc_norm\": 0.43005181347150256,\n \"acc_norm_stderr\": 0.03572954333144809\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.3564102564102564,\n \"acc_stderr\": 0.0242831405294673,\n \"acc_norm\": 0.3564102564102564,\n \"acc_norm_stderr\": 0.0242831405294673\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.027309140588230165,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.027309140588230165\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.03196876989195778,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.03196876989195778\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.41651376146788993,\n \"acc_stderr\": 0.02113637650403087,\n \"acc_norm\": 0.41651376146788993,\n \"acc_norm_stderr\": 0.02113637650403087\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.36574074074074076,\n \"acc_stderr\": 0.03284738857647207,\n \"acc_norm\": 0.36574074074074076,\n \"acc_norm_stderr\": 0.03284738857647207\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.39705882352941174,\n \"acc_stderr\": 0.03434131164719129,\n \"acc_norm\": 0.39705882352941174,\n \"acc_norm_stderr\": 0.03434131164719129\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.3755274261603376,\n \"acc_stderr\": 0.03152256243091156,\n \"acc_norm\": 0.3755274261603376,\n \"acc_norm_stderr\": 0.03152256243091156\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3901345291479821,\n \"acc_stderr\": 0.03273766725459157,\n \"acc_norm\": 0.3901345291479821,\n \"acc_norm_stderr\": 0.03273766725459157\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.4198473282442748,\n \"acc_stderr\": 0.04328577215262972,\n \"acc_norm\": 0.4198473282442748,\n \"acc_norm_stderr\": 0.04328577215262972\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.45454545454545453,\n \"acc_stderr\": 0.045454545454545456,\n \"acc_norm\": 0.45454545454545453,\n \"acc_norm_stderr\": 0.045454545454545456\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.04668408033024931,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.04668408033024931\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.4601226993865031,\n \"acc_stderr\": 0.03915857291436972,\n \"acc_norm\": 0.4601226993865031,\n \"acc_norm_stderr\": 0.03915857291436972\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n \"acc_stderr\": 0.04464285714285714,\n \"acc_norm\": 0.33035714285714285,\n \"acc_norm_stderr\": 0.04464285714285714\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.3883495145631068,\n \"acc_stderr\": 0.0482572933735639,\n \"acc_norm\": 0.3883495145631068,\n \"acc_norm_stderr\": 0.0482572933735639\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6538461538461539,\n \"acc_stderr\": 0.031166957367235903,\n \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.031166957367235903\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 
0.049431107042371025\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4367816091954023,\n \"acc_stderr\": 0.017736470837800687,\n \"acc_norm\": 0.4367816091954023,\n \"acc_norm_stderr\": 0.017736470837800687\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.4190751445086705,\n \"acc_stderr\": 0.026564178111422622,\n \"acc_norm\": 0.4190751445086705,\n \"acc_norm_stderr\": 0.026564178111422622\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2581005586592179,\n \"acc_stderr\": 0.014635185616527824,\n \"acc_norm\": 0.2581005586592179,\n \"acc_norm_stderr\": 0.014635185616527824\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.02795604616542451,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.02795604616542451\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.42765273311897106,\n \"acc_stderr\": 0.028099240775809567,\n \"acc_norm\": 0.42765273311897106,\n \"acc_norm_stderr\": 0.028099240775809567\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.36419753086419754,\n \"acc_stderr\": 0.02677492989972232,\n \"acc_norm\": 0.36419753086419754,\n \"acc_norm_stderr\": 0.02677492989972232\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3191489361702128,\n \"acc_stderr\": 0.0278079901413202,\n \"acc_norm\": 0.3191489361702128,\n \"acc_norm_stderr\": 0.0278079901413202\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2842242503259452,\n \"acc_stderr\": 0.011519880596516078,\n \"acc_norm\": 0.2842242503259452,\n \"acc_norm_stderr\": 0.011519880596516078\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.3492647058823529,\n \"acc_stderr\": 0.028959755196824855,\n \"acc_norm\": 0.3492647058823529,\n \"acc_norm_stderr\": 0.028959755196824855\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.35130718954248363,\n \"acc_stderr\": 0.019312676065786558,\n \"acc_norm\": 0.35130718954248363,\n \"acc_norm_stderr\": 0.019312676065786558\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.4530612244897959,\n \"acc_stderr\": 0.03186785930004128,\n \"acc_norm\": 0.4530612244897959,\n \"acc_norm_stderr\": 0.03186785930004128\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.42786069651741293,\n \"acc_stderr\": 0.03498541988407795,\n \"acc_norm\": 0.42786069651741293,\n \"acc_norm_stderr\": 0.03498541988407795\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.37349397590361444,\n \"acc_stderr\": 0.037658451171688624,\n \"acc_norm\": 0.37349397590361444,\n \"acc_norm_stderr\": 0.037658451171688624\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3508771929824561,\n \"acc_stderr\": 0.03660298834049162,\n \"acc_norm\": 0.3508771929824561,\n \"acc_norm_stderr\": 0.03660298834049162\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2802937576499388,\n \"mc1_stderr\": 0.015723139524608753,\n \"mc2\": 0.455556353489654,\n \"mc2_stderr\": 0.015280106274741655\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5682715074980268,\n \"acc_stderr\": 0.01392087211001071\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.2676269901440485,\n \"acc_stderr\": 0.012194764427053343\n }\n}\n```", "repo_url": "https://huggingface.co/deepseek-ai/deepseek-coder-6.7b-instruct", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|arc:challenge|25_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|gsm8k|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hellaswag|10_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T09-40-26.509293.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T09-40-26.509293.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T09-40-26.509293.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T09-40-26.509293.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T09-40-26.509293.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["**/details_harness|winogrande|5_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-05T09-40-26.509293.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_05T09_40_26.509293", "path": ["results_2024-01-05T09-40-26.509293.parquet"]}, {"split": "latest", "path": ["results_2024-01-05T09-40-26.509293.parquet"]}]}]} | 2024-01-05T09:43:10+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of deepseek-ai/deepseek-coder-6.7b-instruct
Dataset automatically created during the evaluation run of model deepseek-ai/deepseek-coder-6.7b-instruct on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
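A minimal sketch of such a call, assuming the details repository follows the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming convention:

```python
from datasets import load_dataset

# Repository name inferred from the leaderboard's naming convention (an assumption,
# not confirmed by this card); loads the Winogrande details for the latest run.
data = load_dataset(
    "open-llm-leaderboard/details_deepseek-ai__deepseek-coder-6.7b-instruct",
    "harness_winogrande_5",
    split="train",
)
```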
## Latest results
These are the latest results from run 2024-01-05T09:40:26.509293 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of deepseek-ai/deepseek-coder-6.7b-instruct\n\n\n\nDataset automatically created during the evaluation run of model deepseek-ai/deepseek-coder-6.7b-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T09:40:26.509293(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of deepseek-ai/deepseek-coder-6.7b-instruct\n\n\n\nDataset automatically created during the evaluation run of model deepseek-ai/deepseek-coder-6.7b-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T09:40:26.509293(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
197,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of deepseek-ai/deepseek-coder-6.7b-instruct\n\n\n\nDataset automatically created during the evaluation run of model deepseek-ai/deepseek-coder-6.7b-instruct on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-05T09:40:26.509293(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
1bf1ee52f6331464fc0449b4866db790a3a8c100 |
# Dataset Card for Evaluation run of allbyai/ToRoLaMa-7b-v1.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [allbyai/ToRoLaMa-7b-v1.0](https://huggingface.co/allbyai/ToRoLaMa-7b-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_allbyai__ToRoLaMa-7b-v1.0",
"harness_winogrande_5",
split="train")
```
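The aggregated metrics can be loaded in the same way through the "results" configuration (a minimal sketch; the "latest" split name is an assumption based on the split naming used by the per-task configurations of this dataset):

```python
from datasets import load_dataset

# Aggregated metrics of the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_allbyai__ToRoLaMa-7b-v1.0",
	"results",
	split="latest")
```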
## Latest results
These are the [latest results from run 2024-01-05T09:43:59.013115](https://huggingface.co/datasets/open-llm-leaderboard/details_allbyai__ToRoLaMa-7b-v1.0/blob/main/results_2024-01-05T09-43-59.013115.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.45235843004419446,
"acc_stderr": 0.03423154255354607,
"acc_norm": 0.45929415154501946,
"acc_norm_stderr": 0.035110482824261206,
"mc1": 0.30354957160342716,
"mc1_stderr": 0.016095884155386844,
"mc2": 0.44894454656581184,
"mc2_stderr": 0.015890874190577126
},
"harness|arc:challenge|25": {
"acc": 0.47013651877133106,
"acc_stderr": 0.014585305840007104,
"acc_norm": 0.5170648464163823,
"acc_norm_stderr": 0.0146028783885366
},
"harness|hellaswag|10": {
"acc": 0.566122286397132,
"acc_stderr": 0.004945956744943814,
"acc_norm": 0.7381995618402709,
"acc_norm_stderr": 0.00438716120308797
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.42962962962962964,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.42962962962962964,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.45394736842105265,
"acc_stderr": 0.04051646342874143,
"acc_norm": 0.45394736842105265,
"acc_norm_stderr": 0.04051646342874143
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.46037735849056605,
"acc_stderr": 0.030676096599389188,
"acc_norm": 0.46037735849056605,
"acc_norm_stderr": 0.030676096599389188
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.041553199555931467,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.041553199555931467
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3699421965317919,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.3699421965317919,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4127659574468085,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.4127659574468085,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.04372748290278006,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.04372748290278006
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.29894179894179895,
"acc_stderr": 0.023577604791655826,
"acc_norm": 0.29894179894179895,
"acc_norm_stderr": 0.023577604791655826
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04006168083848877,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04006168083848877
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5258064516129032,
"acc_stderr": 0.028406095057653326,
"acc_norm": 0.5258064516129032,
"acc_norm_stderr": 0.028406095057653326
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0317852971064275,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0317852971064275
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6060606060606061,
"acc_stderr": 0.0381549430868893,
"acc_norm": 0.6060606060606061,
"acc_norm_stderr": 0.0381549430868893
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5757575757575758,
"acc_stderr": 0.035212249088415845,
"acc_norm": 0.5757575757575758,
"acc_norm_stderr": 0.035212249088415845
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6476683937823834,
"acc_stderr": 0.03447478286414358,
"acc_norm": 0.6476683937823834,
"acc_norm_stderr": 0.03447478286414358
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3769230769230769,
"acc_stderr": 0.024570975364225995,
"acc_norm": 0.3769230769230769,
"acc_norm_stderr": 0.024570975364225995
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085622,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085622
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.42436974789915966,
"acc_stderr": 0.03210479051015776,
"acc_norm": 0.42436974789915966,
"acc_norm_stderr": 0.03210479051015776
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.03603038545360384,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.03603038545360384
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5486238532110091,
"acc_stderr": 0.021335714711268786,
"acc_norm": 0.5486238532110091,
"acc_norm_stderr": 0.021335714711268786
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.030058202704309846,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.030058202704309846
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.03460228327239172,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.03460228327239172
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5907172995780591,
"acc_stderr": 0.032007041833595914,
"acc_norm": 0.5907172995780591,
"acc_norm_stderr": 0.032007041833595914
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.547085201793722,
"acc_stderr": 0.033408675019233246,
"acc_norm": 0.547085201793722,
"acc_norm_stderr": 0.033408675019233246
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5190839694656488,
"acc_stderr": 0.04382094705550988,
"acc_norm": 0.5190839694656488,
"acc_norm_stderr": 0.04382094705550988
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6446280991735537,
"acc_stderr": 0.0436923632657398,
"acc_norm": 0.6446280991735537,
"acc_norm_stderr": 0.0436923632657398
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.04812917324536824,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.04812917324536824
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.49079754601226994,
"acc_stderr": 0.03927705600787443,
"acc_norm": 0.49079754601226994,
"acc_norm_stderr": 0.03927705600787443
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.6407766990291263,
"acc_stderr": 0.04750458399041695,
"acc_norm": 0.6407766990291263,
"acc_norm_stderr": 0.04750458399041695
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6923076923076923,
"acc_stderr": 0.03023638994217309,
"acc_norm": 0.6923076923076923,
"acc_norm_stderr": 0.03023638994217309
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562427,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562427
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6079182630906769,
"acc_stderr": 0.017458524050147636,
"acc_norm": 0.6079182630906769,
"acc_norm_stderr": 0.017458524050147636
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5,
"acc_stderr": 0.026919095102908273,
"acc_norm": 0.5,
"acc_norm_stderr": 0.026919095102908273
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.28938547486033517,
"acc_stderr": 0.01516654455049031,
"acc_norm": 0.28938547486033517,
"acc_norm_stderr": 0.01516654455049031
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5032679738562091,
"acc_stderr": 0.028629305194003543,
"acc_norm": 0.5032679738562091,
"acc_norm_stderr": 0.028629305194003543
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5369774919614148,
"acc_stderr": 0.02832032583010591,
"acc_norm": 0.5369774919614148,
"acc_norm_stderr": 0.02832032583010591
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4783950617283951,
"acc_stderr": 0.02779476010500874,
"acc_norm": 0.4783950617283951,
"acc_norm_stderr": 0.02779476010500874
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.34397163120567376,
"acc_stderr": 0.028338017428611324,
"acc_norm": 0.34397163120567376,
"acc_norm_stderr": 0.028338017428611324
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.32985658409387225,
"acc_stderr": 0.012008129938540469,
"acc_norm": 0.32985658409387225,
"acc_norm_stderr": 0.012008129938540469
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.029520095697687758,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.029520095697687758
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.02008736207670286,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.02008736207670286
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.44545454545454544,
"acc_stderr": 0.047605488214603246,
"acc_norm": 0.44545454545454544,
"acc_norm_stderr": 0.047605488214603246
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5102040816326531,
"acc_stderr": 0.03200255347893782,
"acc_norm": 0.5102040816326531,
"acc_norm_stderr": 0.03200255347893782
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6368159203980099,
"acc_stderr": 0.03400598505599014,
"acc_norm": 0.6368159203980099,
"acc_norm_stderr": 0.03400598505599014
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079022,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079022
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6432748538011696,
"acc_stderr": 0.03674013002860954,
"acc_norm": 0.6432748538011696,
"acc_norm_stderr": 0.03674013002860954
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30354957160342716,
"mc1_stderr": 0.016095884155386844,
"mc2": 0.44894454656581184,
"mc2_stderr": 0.015890874190577126
},
"harness|winogrande|5": {
"acc": 0.7008681925808997,
"acc_stderr": 0.012868639066091533
},
"harness|gsm8k|5": {
"acc": 0.013646702047005308,
"acc_stderr": 0.003195747075480772
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_allbyai__ToRoLaMa-7b-v1.0 | [
"region:us"
] | 2024-01-05T09:46:25+00:00 | {"pretty_name": "Evaluation run of allbyai/ToRoLaMa-7b-v1.0", "dataset_summary": "Dataset automatically created during the evaluation run of model [allbyai/ToRoLaMa-7b-v1.0](https://huggingface.co/allbyai/ToRoLaMa-7b-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_allbyai__ToRoLaMa-7b-v1.0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-05T09:43:59.013115](https://huggingface.co/datasets/open-llm-leaderboard/details_allbyai__ToRoLaMa-7b-v1.0/blob/main/results_2024-01-05T09-43-59.013115.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.45235843004419446,\n \"acc_stderr\": 0.03423154255354607,\n \"acc_norm\": 0.45929415154501946,\n \"acc_norm_stderr\": 0.035110482824261206,\n \"mc1\": 0.30354957160342716,\n \"mc1_stderr\": 0.016095884155386844,\n \"mc2\": 0.44894454656581184,\n \"mc2_stderr\": 0.015890874190577126\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.47013651877133106,\n \"acc_stderr\": 0.014585305840007104,\n \"acc_norm\": 0.5170648464163823,\n \"acc_norm_stderr\": 0.0146028783885366\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.566122286397132,\n \"acc_stderr\": 0.004945956744943814,\n \"acc_norm\": 0.7381995618402709,\n \"acc_norm_stderr\": 0.00438716120308797\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.42962962962962964,\n \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.42962962962962964,\n \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.45394736842105265,\n \"acc_stderr\": 0.04051646342874143,\n \"acc_norm\": 0.45394736842105265,\n \"acc_norm_stderr\": 0.04051646342874143\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.46037735849056605,\n \"acc_stderr\": 0.030676096599389188,\n \"acc_norm\": 0.46037735849056605,\n \"acc_norm_stderr\": 0.030676096599389188\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.041553199555931467,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.041553199555931467\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3699421965317919,\n \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.3699421965317919,\n \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4127659574468085,\n \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.4127659574468085,\n \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.04372748290278006,\n \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.04372748290278006\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.041546596717075474,\n \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.041546596717075474\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.29894179894179895,\n \"acc_stderr\": 0.023577604791655826,\n \"acc_norm\": 0.29894179894179895,\n \"acc_norm_stderr\": 0.023577604791655826\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.04006168083848877,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.04006168083848877\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5258064516129032,\n \"acc_stderr\": 0.028406095057653326,\n \"acc_norm\": 0.5258064516129032,\n \"acc_norm_stderr\": 0.028406095057653326\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.0317852971064275,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.0317852971064275\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6060606060606061,\n \"acc_stderr\": 0.0381549430868893,\n \"acc_norm\": 0.6060606060606061,\n \"acc_norm_stderr\": 0.0381549430868893\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5757575757575758,\n \"acc_stderr\": 0.035212249088415845,\n \"acc_norm\": 0.5757575757575758,\n \"acc_norm_stderr\": 0.035212249088415845\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6476683937823834,\n \"acc_stderr\": 0.03447478286414358,\n \"acc_norm\": 0.6476683937823834,\n \"acc_norm_stderr\": 0.03447478286414358\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.3769230769230769,\n \"acc_stderr\": 0.024570975364225995,\n \"acc_norm\": 0.3769230769230769,\n \"acc_norm_stderr\": 0.024570975364225995\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085622,\n \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085622\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.42436974789915966,\n \"acc_stderr\": 0.03210479051015776,\n \"acc_norm\": 0.42436974789915966,\n \"acc_norm_stderr\": 0.03210479051015776\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360384,\n \"acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360384\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.5486238532110091,\n \"acc_stderr\": 0.021335714711268786,\n \"acc_norm\": 0.5486238532110091,\n \"acc_norm_stderr\": 0.021335714711268786\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2638888888888889,\n \"acc_stderr\": 0.030058202704309846,\n \"acc_norm\": 0.2638888888888889,\n \"acc_norm_stderr\": 0.030058202704309846\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5833333333333334,\n \"acc_stderr\": 0.03460228327239172,\n \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.03460228327239172\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.5907172995780591,\n \"acc_stderr\": 0.032007041833595914,\n \"acc_norm\": 0.5907172995780591,\n \"acc_norm_stderr\": 0.032007041833595914\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.547085201793722,\n \"acc_stderr\": 0.033408675019233246,\n \"acc_norm\": 0.547085201793722,\n \"acc_norm_stderr\": 0.033408675019233246\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5190839694656488,\n \"acc_stderr\": 0.04382094705550988,\n \"acc_norm\": 0.5190839694656488,\n \"acc_norm_stderr\": 0.04382094705550988\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6446280991735537,\n \"acc_stderr\": 0.0436923632657398,\n \"acc_norm\": 0.6446280991735537,\n \"acc_norm_stderr\": 0.0436923632657398\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5462962962962963,\n \"acc_stderr\": 0.04812917324536824,\n \"acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.04812917324536824\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.49079754601226994,\n \"acc_stderr\": 0.03927705600787443,\n \"acc_norm\": 0.49079754601226994,\n \"acc_norm_stderr\": 0.03927705600787443\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6407766990291263,\n \"acc_stderr\": 0.04750458399041695,\n \"acc_norm\": 0.6407766990291263,\n \"acc_norm_stderr\": 0.04750458399041695\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6923076923076923,\n \"acc_stderr\": 0.03023638994217309,\n \"acc_norm\": 0.6923076923076923,\n \"acc_norm_stderr\": 0.03023638994217309\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562427,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562427\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6079182630906769,\n \"acc_stderr\": 0.017458524050147636,\n \"acc_norm\": 
0.6079182630906769,\n \"acc_norm_stderr\": 0.017458524050147636\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.026919095102908273,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.026919095102908273\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.28938547486033517,\n \"acc_stderr\": 0.01516654455049031,\n \"acc_norm\": 0.28938547486033517,\n \"acc_norm_stderr\": 0.01516654455049031\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5032679738562091,\n \"acc_stderr\": 0.028629305194003543,\n \"acc_norm\": 0.5032679738562091,\n \"acc_norm_stderr\": 0.028629305194003543\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5369774919614148,\n \"acc_stderr\": 0.02832032583010591,\n \"acc_norm\": 0.5369774919614148,\n \"acc_norm_stderr\": 0.02832032583010591\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.4783950617283951,\n \"acc_stderr\": 0.02779476010500874,\n \"acc_norm\": 0.4783950617283951,\n \"acc_norm_stderr\": 0.02779476010500874\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.34397163120567376,\n \"acc_stderr\": 0.028338017428611324,\n \"acc_norm\": 0.34397163120567376,\n \"acc_norm_stderr\": 0.028338017428611324\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.32985658409387225,\n \"acc_stderr\": 0.012008129938540469,\n \"acc_norm\": 0.32985658409387225,\n \"acc_norm_stderr\": 0.012008129938540469\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.029520095697687758,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.029520095697687758\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.02008736207670286,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.02008736207670286\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.44545454545454544,\n \"acc_stderr\": 0.047605488214603246,\n \"acc_norm\": 0.44545454545454544,\n \"acc_norm_stderr\": 0.047605488214603246\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5102040816326531,\n \"acc_stderr\": 0.03200255347893782,\n \"acc_norm\": 0.5102040816326531,\n \"acc_norm_stderr\": 0.03200255347893782\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6368159203980099,\n \"acc_stderr\": 0.03400598505599014,\n \"acc_norm\": 0.6368159203980099,\n \"acc_norm_stderr\": 0.03400598505599014\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n \"acc_stderr\": 0.03828401115079022,\n \"acc_norm\": 0.40963855421686746,\n \"acc_norm_stderr\": 0.03828401115079022\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6432748538011696,\n \"acc_stderr\": 0.03674013002860954,\n \"acc_norm\": 0.6432748538011696,\n \"acc_norm_stderr\": 0.03674013002860954\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30354957160342716,\n \"mc1_stderr\": 0.016095884155386844,\n \"mc2\": 0.44894454656581184,\n \"mc2_stderr\": 0.015890874190577126\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7008681925808997,\n \"acc_stderr\": 0.012868639066091533\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.013646702047005308,\n \"acc_stderr\": 0.003195747075480772\n }\n}\n```", "repo_url": "https://huggingface.co/allbyai/ToRoLaMa-7b-v1.0", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|arc:challenge|25_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|gsm8k|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hellaswag|10_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T09-43-59.013115.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T09-43-59.013115.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T09-43-59.013115.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T09-43-59.013115.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T09-43-59.013115.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T09-43-59.013115.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["**/details_harness|winogrande|5_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-05T09-43-59.013115.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_05T09_43_59.013115", "path": ["results_2024-01-05T09-43-59.013115.parquet"]}, {"split": "latest", "path": 
["results_2024-01-05T09-43-59.013115.parquet"]}]}]} | 2024-01-05T09:46:50+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of allbyai/ToRoLaMa-7b-v1.0
Dataset automatically created during the evaluation run of model allbyai/ToRoLaMa-7b-v1.0 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
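The snippet below follows the same loading pattern used by the other evaluation-run cards in this document; the repository name `open-llm-leaderboard/details_allbyai__ToRoLaMa-7b-v1.0` is inferred from that `details_<org>__<model>` naming convention rather than quoted from this card.
```python
from datasets import load_dataset

# Repository name assumed from the leaderboard's details_<org>__<model> convention
data = load_dataset("open-llm-leaderboard/details_allbyai__ToRoLaMa-7b-v1.0",
	"harness_winogrande_5",
	split="train")
```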
## Latest results
These are the latest results from run 2024-01-05T09:43:59.013115 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of allbyai/ToRoLaMa-7b-v1.0\n\n\n\nDataset automatically created during the evaluation run of model allbyai/ToRoLaMa-7b-v1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T09:43:59.013115(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of allbyai/ToRoLaMa-7b-v1.0\n\n\n\nDataset automatically created during the evaluation run of model allbyai/ToRoLaMa-7b-v1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T09:43:59.013115(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
187,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of allbyai/ToRoLaMa-7b-v1.0\n\n\n\nDataset automatically created during the evaluation run of model allbyai/ToRoLaMa-7b-v1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-05T09:43:59.013115(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
9b30f897534ac42e5afef492e4aa9488b1cc2b6a |
This dataset is a tiny subset of [Intel/orca_dpo_pairs](https://huggingface.co/datasets/Intel/orca_dpo_pairs), used for internal testing. | BramVanroy/test-dataset-dont-delete | [
"region:us"
] | 2024-01-05T09:51:47+00:00 | {"dataset_info": {"features": [{"name": "system", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "chosen", "dtype": "string"}, {"name": "rejected", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 8856, "num_examples": 4}], "download_size": 25365, "dataset_size": 8856}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-05T09:58:05+00:00 | [] | [] | TAGS
#region-us
|
This dataset is a tiny subset of Intel/orca_dpo_pairs, used for internal testing. | [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
225a8cdbc955f3d75c48feac26b449699526f533 |
# Dataset Card for Evaluation run of NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v2](https://huggingface.co/NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NickyNicky__Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v2",
"harness_winogrande_5",
split="train")
```
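For the aggregated metrics rather than per-task details, a similar call can target the "results" configuration listed in this card's metadata; this is a sketch added for illustration, relying on the "results" config and "latest" split names defined above.

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run; the "results" config and "latest"
# split follow the conventions used throughout this card's metadata.
results = load_dataset("open-llm-leaderboard/details_NickyNicky__Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v2",
	"results",
	split="latest")
```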
## Latest results
These are the [latest results from run 2024-01-05T10:04:28.728094](https://huggingface.co/datasets/open-llm-leaderboard/details_NickyNicky__Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v2/blob/main/results_2024-01-05T10-04-28.728094.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6213400817324534,
"acc_stderr": 0.03263383145717375,
"acc_norm": 0.6264588975889877,
"acc_norm_stderr": 0.03329437822786877,
"mc1": 0.3084455324357405,
"mc1_stderr": 0.01616803938315687,
"mc2": 0.46376151030867085,
"mc2_stderr": 0.01457773326521732
},
"harness|arc:challenge|25": {
"acc": 0.5597269624573379,
"acc_stderr": 0.014506769524804232,
"acc_norm": 0.6049488054607508,
"acc_norm_stderr": 0.014285898292938165
},
"harness|hellaswag|10": {
"acc": 0.6201951802429795,
"acc_stderr": 0.0048434625459435,
"acc_norm": 0.8206532563234415,
"acc_norm_stderr": 0.0038285834080213836
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926605,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926605
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6641509433962264,
"acc_stderr": 0.02906722014664483,
"acc_norm": 0.6641509433962264,
"acc_norm_stderr": 0.02906722014664483
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006717,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006717
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.046306532033665956,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.046306532033665956
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055263,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055263
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7483870967741936,
"acc_stderr": 0.024685979286239976,
"acc_norm": 0.7483870967741936,
"acc_norm_stderr": 0.024685979286239976
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.03374402644139403,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.03374402644139403
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.02423353229775873,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.02423353229775873
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.617948717948718,
"acc_stderr": 0.024635549163908237,
"acc_norm": 0.617948717948718,
"acc_norm_stderr": 0.024635549163908237
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.02931820364520686,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.02931820364520686
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.634453781512605,
"acc_stderr": 0.03128217706368461,
"acc_norm": 0.634453781512605,
"acc_norm_stderr": 0.03128217706368461
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8146788990825689,
"acc_stderr": 0.016659279700295838,
"acc_norm": 0.8146788990825689,
"acc_norm_stderr": 0.016659279700295838
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588663,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588663
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709697,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709697
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597524,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597524
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8109833971902938,
"acc_stderr": 0.014000791294407,
"acc_norm": 0.8109833971902938,
"acc_norm_stderr": 0.014000791294407
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.684971098265896,
"acc_stderr": 0.025009313790069716,
"acc_norm": 0.684971098265896,
"acc_norm_stderr": 0.025009313790069716
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2860335195530726,
"acc_stderr": 0.015113972129062129,
"acc_norm": 0.2860335195530726,
"acc_norm_stderr": 0.015113972129062129
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.02664327847450875,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.02664327847450875
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153266,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153266
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6882716049382716,
"acc_stderr": 0.025773111169630453,
"acc_norm": 0.6882716049382716,
"acc_norm_stderr": 0.025773111169630453
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.02973659252642444,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.02973659252642444
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46088657105606257,
"acc_stderr": 0.012731102790504519,
"acc_norm": 0.46088657105606257,
"acc_norm_stderr": 0.012731102790504519
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.02928941340940319,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.02928941340940319
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6535947712418301,
"acc_stderr": 0.019249785691717206,
"acc_norm": 0.6535947712418301,
"acc_norm_stderr": 0.019249785691717206
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.02879518557429129,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.02879518557429129
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827075,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827075
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727668,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727668
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3084455324357405,
"mc1_stderr": 0.01616803938315687,
"mc2": 0.46376151030867085,
"mc2_stderr": 0.01457773326521732
},
"harness|winogrande|5": {
"acc": 0.7845303867403315,
"acc_stderr": 0.011555295286059282
},
"harness|gsm8k|5": {
"acc": 0.40181956027293403,
"acc_stderr": 0.013504357787494039
}
}
```
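As an illustrative aside (not part of the generated card): the block above is a plain dictionary keyed by task name, so it can be sliced with a few lines of Python — for instance to rank the per-subject MMLU (`hendrycksTest`) accuracies. The excerpt below hard-codes a few of the values shown above purely for demonstration.

```python
# A small excerpt of the dictionary shown above, for illustration only.
results = {
    "harness|hendrycksTest-machine_learning|5": {"acc": 0.4642857142857143},
    "harness|hendrycksTest-management|5": {"acc": 0.7669902912621359},
    "harness|hendrycksTest-marketing|5": {"acc": 0.8589743589743589},
    "harness|truthfulqa:mc|0": {"mc1": 0.3084455324357405},
}

# Rank the per-subject MMLU ("hendrycksTest") accuracies.
mmlu = {
    task.split("-", 1)[1].split("|")[0]: metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
}
for subject, acc in sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{subject}: {acc:.3f}")
```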
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_NickyNicky__Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v2 | [
"region:us"
] | 2024-01-05T10:06:46+00:00 | {"pretty_name": "Evaluation run of NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v2](https://huggingface.co/NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NickyNicky__Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-05T10:04:28.728094](https://huggingface.co/datasets/open-llm-leaderboard/details_NickyNicky__Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v2/blob/main/results_2024-01-05T10-04-28.728094.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6213400817324534,\n \"acc_stderr\": 0.03263383145717375,\n \"acc_norm\": 0.6264588975889877,\n \"acc_norm_stderr\": 0.03329437822786877,\n \"mc1\": 0.3084455324357405,\n \"mc1_stderr\": 0.01616803938315687,\n \"mc2\": 0.46376151030867085,\n \"mc2_stderr\": 0.01457773326521732\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5597269624573379,\n \"acc_stderr\": 0.014506769524804232,\n \"acc_norm\": 0.6049488054607508,\n \"acc_norm_stderr\": 0.014285898292938165\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6201951802429795,\n \"acc_stderr\": 0.0048434625459435,\n \"acc_norm\": 0.8206532563234415,\n \"acc_norm_stderr\": 0.0038285834080213836\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926605,\n \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926605\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6641509433962264,\n \"acc_stderr\": 0.02906722014664483,\n \"acc_norm\": 0.6641509433962264,\n \"acc_norm_stderr\": 0.02906722014664483\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n \"acc_stderr\": 0.03685651095897532,\n 
\"acc_norm\": 0.7361111111111112,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006717,\n \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006717\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n \"acc_stderr\": 0.046306532033665956,\n \"acc_norm\": 0.41228070175438597,\n \"acc_norm_stderr\": 0.046306532033665956\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055263,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055263\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7483870967741936,\n \"acc_stderr\": 0.024685979286239976,\n \"acc_norm\": 0.7483870967741936,\n \"acc_norm_stderr\": 0.024685979286239976\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.03374402644139403,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.03374402644139403\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8704663212435233,\n 
\"acc_stderr\": 0.02423353229775873,\n \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.02423353229775873\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.617948717948718,\n \"acc_stderr\": 0.024635549163908237,\n \"acc_norm\": 0.617948717948718,\n \"acc_norm_stderr\": 0.024635549163908237\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.362962962962963,\n \"acc_stderr\": 0.02931820364520686,\n \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.02931820364520686\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.03128217706368461,\n \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.03128217706368461\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8146788990825689,\n \"acc_stderr\": 0.016659279700295838,\n \"acc_norm\": 0.8146788990825689,\n \"acc_norm_stderr\": 0.016659279700295838\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588663,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588663\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709697,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709697\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597524,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597524\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 
0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8109833971902938,\n \"acc_stderr\": 0.014000791294407,\n \"acc_norm\": 0.8109833971902938,\n \"acc_norm_stderr\": 0.014000791294407\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.684971098265896,\n \"acc_stderr\": 0.025009313790069716,\n \"acc_norm\": 0.684971098265896,\n \"acc_norm_stderr\": 0.025009313790069716\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2860335195530726,\n \"acc_stderr\": 0.015113972129062129,\n \"acc_norm\": 0.2860335195530726,\n \"acc_norm_stderr\": 0.015113972129062129\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.02664327847450875,\n \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.02664327847450875\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n \"acc_stderr\": 0.026236965881153266,\n \"acc_norm\": 0.6913183279742765,\n \"acc_norm_stderr\": 0.026236965881153266\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6882716049382716,\n \"acc_stderr\": 0.025773111169630453,\n \"acc_norm\": 0.6882716049382716,\n \"acc_norm_stderr\": 0.025773111169630453\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46099290780141844,\n \"acc_stderr\": 0.02973659252642444,\n \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.02973659252642444\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46088657105606257,\n \"acc_stderr\": 0.012731102790504519,\n \"acc_norm\": 0.46088657105606257,\n \"acc_norm_stderr\": 0.012731102790504519\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.02928941340940319,\n \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.02928941340940319\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6535947712418301,\n \"acc_stderr\": 0.019249785691717206,\n \"acc_norm\": 0.6535947712418301,\n \"acc_norm_stderr\": 0.019249785691717206\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.02879518557429129,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.02879518557429129\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.025196929874827075,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.025196929874827075\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727668,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727668\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3084455324357405,\n \"mc1_stderr\": 0.01616803938315687,\n \"mc2\": 0.46376151030867085,\n \"mc2_stderr\": 0.01457773326521732\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7845303867403315,\n \"acc_stderr\": 0.011555295286059282\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.40181956027293403,\n \"acc_stderr\": 0.013504357787494039\n }\n}\n```", "repo_url": "https://huggingface.co/NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|arc:challenge|25_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|gsm8k|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hellaswag|10_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T10-04-28.728094.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T10-04-28.728094.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T10-04-28.728094.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T10-04-28.728094.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T10-04-28.728094.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["**/details_harness|winogrande|5_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-05T10-04-28.728094.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_05T10_04_28.728094", "path": ["results_2024-01-05T10-04-28.728094.parquet"]}, {"split": "latest", "path": ["results_2024-01-05T10-04-28.728094.parquet"]}]}]} | 2024-01-05T10:07:09+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v2
Dataset automatically created during the evaluation run of model NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
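The code block for that call was stripped from this rendering; a minimal sketch of it, using one of the configurations listed for this dataset (`harness_winogrande_5`), would be:

```python
from datasets import load_dataset

# "train" always points to the latest results, as described above.
data = load_dataset(
    "open-llm-leaderboard/details_NickyNicky__Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v2",
    "harness_winogrande_5",
    split="train",
)
```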
## Latest results
These are the latest results from run 2024-01-05T10:04:28.728094 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v2\n\n\n\nDataset automatically created during the evaluation run of model NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T10:04:28.728094(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v2\n\n\n\nDataset automatically created during the evaluation run of model NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T10:04:28.728094(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
217,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v2\n\n\n\nDataset automatically created during the evaluation run of model NickyNicky/Mistral-7B-OpenOrca-oasst_top1_2023-08-25-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-05T10:04:28.728094(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:"
] |
39ba8a13d27f45284bb6396f7f27f80f53d874c3 |
# Dataset Card for Evaluation run of OpenBuddy/openbuddy-zephyr-7b-v14.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-zephyr-7b-v14.1](https://huggingface.co/OpenBuddy/openbuddy-zephyr-7b-v14.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenBuddy__openbuddy-zephyr-7b-v14.1",
"harness_winogrande_5",
split="train")
```
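If you want the aggregated numbers rather than the per-task details, a minimal sketch is shown below. It assumes the "results" configuration mentioned above exists under that exact name and that its splits follow the same timestamp/"latest" convention; `get_dataset_config_names` is only used to list the available task configurations.

```python
# Sketch (assumption: the aggregated config is literally named "results",
# as described in this card, and exposes a "latest" split).
from datasets import get_dataset_config_names, load_dataset

repo_id = "open-llm-leaderboard/details_OpenBuddy__openbuddy-zephyr-7b-v14.1"

# List the per-task configurations (one per evaluated task).
print(get_dataset_config_names(repo_id))

# Load the aggregated results; "latest" always points to the most recent run.
aggregated = load_dataset(repo_id, "results", split="latest")
print(aggregated)
```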
## Latest results
These are the [latest results from run 2024-01-05T10:04:44.823232](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-zephyr-7b-v14.1/blob/main/results_2024-01-05T10-04-44.823232.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5549728645818907,
"acc_stderr": 0.03373571403724425,
"acc_norm": 0.564574205012897,
"acc_norm_stderr": 0.034558950447180874,
"mc1": 0.3463892288861689,
"mc1_stderr": 0.016656997109125146,
"mc2": 0.4983560382203712,
"mc2_stderr": 0.015025857208173432
},
"harness|arc:challenge|25": {
"acc": 0.47525597269624575,
"acc_stderr": 0.01459348769493774,
"acc_norm": 0.5213310580204779,
"acc_norm_stderr": 0.014598087973127108
},
"harness|hellaswag|10": {
"acc": 0.5603465445130452,
"acc_stderr": 0.004953305461311754,
"acc_norm": 0.7502489543915555,
"acc_norm_stderr": 0.004319842107724388
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.42962962962962964,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.42962962962962964,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5723684210526315,
"acc_stderr": 0.040260970832965634,
"acc_norm": 0.5723684210526315,
"acc_norm_stderr": 0.040260970832965634
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6226415094339622,
"acc_stderr": 0.029832808114796005,
"acc_norm": 0.6226415094339622,
"acc_norm_stderr": 0.029832808114796005
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5902777777777778,
"acc_stderr": 0.04112490974670787,
"acc_norm": 0.5902777777777778,
"acc_norm_stderr": 0.04112490974670787
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5375722543352601,
"acc_stderr": 0.0380168510452446,
"acc_norm": 0.5375722543352601,
"acc_norm_stderr": 0.0380168510452446
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062946,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4425531914893617,
"acc_stderr": 0.032469569197899575,
"acc_norm": 0.4425531914893617,
"acc_norm_stderr": 0.032469569197899575
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.04644602091222318,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.04644602091222318
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.02497695405315524,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.02497695405315524
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7096774193548387,
"acc_stderr": 0.025822106119415895,
"acc_norm": 0.7096774193548387,
"acc_norm_stderr": 0.025822106119415895
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.034991131376767445,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.034991131376767445
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6787878787878788,
"acc_stderr": 0.03646204963253812,
"acc_norm": 0.6787878787878788,
"acc_norm_stderr": 0.03646204963253812
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7373737373737373,
"acc_stderr": 0.03135305009533087,
"acc_norm": 0.7373737373737373,
"acc_norm_stderr": 0.03135305009533087
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7564766839378239,
"acc_stderr": 0.030975436386845436,
"acc_norm": 0.7564766839378239,
"acc_norm_stderr": 0.030975436386845436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.541025641025641,
"acc_stderr": 0.025265525491284295,
"acc_norm": 0.541025641025641,
"acc_norm_stderr": 0.025265525491284295
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.028133252578815635,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.028133252578815635
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.03156663099215416,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.03156663099215416
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242741,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242741
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7724770642201835,
"acc_stderr": 0.017974463578776502,
"acc_norm": 0.7724770642201835,
"acc_norm_stderr": 0.017974463578776502
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.032834720561085606,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.032834720561085606
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7215189873417721,
"acc_stderr": 0.029178682304842548,
"acc_norm": 0.7215189873417721,
"acc_norm_stderr": 0.029178682304842548
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6188340807174888,
"acc_stderr": 0.03259625118416827,
"acc_norm": 0.6188340807174888,
"acc_norm_stderr": 0.03259625118416827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6564885496183206,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.6564885496183206,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7107438016528925,
"acc_stderr": 0.04139112727635464,
"acc_norm": 0.7107438016528925,
"acc_norm_stderr": 0.04139112727635464
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6687116564417178,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.6687116564417178,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285713,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285713
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.719029374201788,
"acc_stderr": 0.01607312785122124,
"acc_norm": 0.719029374201788,
"acc_norm_stderr": 0.01607312785122124
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.026483392042098174,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.026483392042098174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.34413407821229053,
"acc_stderr": 0.015889221313307094,
"acc_norm": 0.34413407821229053,
"acc_norm_stderr": 0.015889221313307094
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6405228758169934,
"acc_stderr": 0.027475969910660952,
"acc_norm": 0.6405228758169934,
"acc_norm_stderr": 0.027475969910660952
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5916398713826366,
"acc_stderr": 0.027917050748484627,
"acc_norm": 0.5916398713826366,
"acc_norm_stderr": 0.027917050748484627
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.595679012345679,
"acc_stderr": 0.027306625297327684,
"acc_norm": 0.595679012345679,
"acc_norm_stderr": 0.027306625297327684
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41134751773049644,
"acc_stderr": 0.029354911159940992,
"acc_norm": 0.41134751773049644,
"acc_norm_stderr": 0.029354911159940992
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42046936114732725,
"acc_stderr": 0.012607654553832703,
"acc_norm": 0.42046936114732725,
"acc_norm_stderr": 0.012607654553832703
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5110294117647058,
"acc_stderr": 0.030365446477275675,
"acc_norm": 0.5110294117647058,
"acc_norm_stderr": 0.030365446477275675
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5147058823529411,
"acc_stderr": 0.020219083895133924,
"acc_norm": 0.5147058823529411,
"acc_norm_stderr": 0.020219083895133924
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.689795918367347,
"acc_stderr": 0.029613459872484375,
"acc_norm": 0.689795918367347,
"acc_norm_stderr": 0.029613459872484375
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7761194029850746,
"acc_stderr": 0.0294752502360172,
"acc_norm": 0.7761194029850746,
"acc_norm_stderr": 0.0294752502360172
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3463892288861689,
"mc1_stderr": 0.016656997109125146,
"mc2": 0.4983560382203712,
"mc2_stderr": 0.015025857208173432
},
"harness|winogrande|5": {
"acc": 0.7324388318863457,
"acc_stderr": 0.012441718456893009
},
"harness|gsm8k|5": {
"acc": 0.04700530705079606,
"acc_stderr": 0.005829898355937175
}
}
```
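To work with these aggregated numbers directly, outside of the `datasets` loader, the raw JSON file linked in the "Latest results" section can be fetched from the Hub. This is a sketch only: the filename is taken from that link, and the top-level layout of the file may wrap the excerpt above in additional keys, so the keys are inspected before use.

```python
# Sketch: download the results file referenced above and inspect it.
# The filename comes from the "Latest results" link; adjust it for newer runs.
import json
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_OpenBuddy__openbuddy-zephyr-7b-v14.1",
    filename="results_2024-01-05T10-04-44.823232.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)

# The excerpt above shows the per-benchmark metrics; depending on the file
# layout they may sit at the top level or under a "results" key.
print(sorted(results.keys()))
all_metrics = results.get("all") or results.get("results", {}).get("all")
print(all_metrics)
```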
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_OpenBuddy__openbuddy-zephyr-7b-v14.1 | [
"region:us"
] | 2024-01-05T10:07:02+00:00 | {"pretty_name": "Evaluation run of OpenBuddy/openbuddy-zephyr-7b-v14.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-zephyr-7b-v14.1](https://huggingface.co/OpenBuddy/openbuddy-zephyr-7b-v14.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenBuddy__openbuddy-zephyr-7b-v14.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-05T10:04:44.823232](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-zephyr-7b-v14.1/blob/main/results_2024-01-05T10-04-44.823232.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5549728645818907,\n \"acc_stderr\": 0.03373571403724425,\n \"acc_norm\": 0.564574205012897,\n \"acc_norm_stderr\": 0.034558950447180874,\n \"mc1\": 0.3463892288861689,\n \"mc1_stderr\": 0.016656997109125146,\n \"mc2\": 0.4983560382203712,\n \"mc2_stderr\": 0.015025857208173432\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.47525597269624575,\n \"acc_stderr\": 0.01459348769493774,\n \"acc_norm\": 0.5213310580204779,\n \"acc_norm_stderr\": 0.014598087973127108\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5603465445130452,\n \"acc_stderr\": 0.004953305461311754,\n \"acc_norm\": 0.7502489543915555,\n \"acc_norm_stderr\": 0.004319842107724388\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.42962962962962964,\n \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.42962962962962964,\n \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5723684210526315,\n \"acc_stderr\": 0.040260970832965634,\n \"acc_norm\": 0.5723684210526315,\n \"acc_norm_stderr\": 0.040260970832965634\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6226415094339622,\n \"acc_stderr\": 0.029832808114796005,\n \"acc_norm\": 0.6226415094339622,\n \"acc_norm_stderr\": 0.029832808114796005\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n \"acc_stderr\": 0.04112490974670787,\n \"acc_norm\": 0.5902777777777778,\n \"acc_norm_stderr\": 0.04112490974670787\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5375722543352601,\n \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.5375722543352601,\n \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062946,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062946\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4425531914893617,\n \"acc_stderr\": 0.032469569197899575,\n \"acc_norm\": 0.4425531914893617,\n \"acc_norm_stderr\": 0.032469569197899575\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.04644602091222318,\n \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.04644602091222318\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3783068783068783,\n \"acc_stderr\": 0.02497695405315524,\n \"acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.02497695405315524\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7096774193548387,\n \"acc_stderr\": 0.025822106119415895,\n \"acc_norm\": 0.7096774193548387,\n \"acc_norm_stderr\": 0.025822106119415895\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.034991131376767445,\n \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.034991131376767445\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.03646204963253812,\n \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.03646204963253812\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7373737373737373,\n \"acc_stderr\": 0.03135305009533087,\n \"acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.03135305009533087\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7564766839378239,\n \"acc_stderr\": 0.030975436386845436,\n \"acc_norm\": 0.7564766839378239,\n 
\"acc_norm_stderr\": 0.030975436386845436\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.541025641025641,\n \"acc_stderr\": 0.025265525491284295,\n \"acc_norm\": 0.541025641025641,\n \"acc_norm_stderr\": 0.025265525491284295\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815635,\n \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815635\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.03156663099215416,\n \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.03156663099215416\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242741,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242741\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7724770642201835,\n \"acc_stderr\": 0.017974463578776502,\n \"acc_norm\": 0.7724770642201835,\n \"acc_norm_stderr\": 0.017974463578776502\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.032834720561085606,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.032834720561085606\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7215189873417721,\n \"acc_stderr\": 0.029178682304842548,\n \"acc_norm\": 0.7215189873417721,\n \"acc_norm_stderr\": 0.029178682304842548\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n \"acc_stderr\": 0.03259625118416827,\n \"acc_norm\": 0.6188340807174888,\n \"acc_norm_stderr\": 0.03259625118416827\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7107438016528925,\n \"acc_stderr\": 0.04139112727635464,\n \"acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.04139112727635464\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6687116564417178,\n \"acc_stderr\": 0.03697983910025588,\n \"acc_norm\": 0.6687116564417178,\n \"acc_norm_stderr\": 0.03697983910025588\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n \"acc_stderr\": 0.04464285714285713,\n \"acc_norm\": 0.33035714285714285,\n \"acc_norm_stderr\": 0.04464285714285713\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.719029374201788,\n \"acc_stderr\": 0.01607312785122124,\n \"acc_norm\": 0.719029374201788,\n \"acc_norm_stderr\": 0.01607312785122124\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.026483392042098174,\n \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.026483392042098174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.34413407821229053,\n \"acc_stderr\": 0.015889221313307094,\n \"acc_norm\": 0.34413407821229053,\n \"acc_norm_stderr\": 0.015889221313307094\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6405228758169934,\n \"acc_stderr\": 0.027475969910660952,\n \"acc_norm\": 0.6405228758169934,\n \"acc_norm_stderr\": 0.027475969910660952\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5916398713826366,\n \"acc_stderr\": 0.027917050748484627,\n \"acc_norm\": 0.5916398713826366,\n \"acc_norm_stderr\": 0.027917050748484627\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.595679012345679,\n \"acc_stderr\": 0.027306625297327684,\n \"acc_norm\": 0.595679012345679,\n \"acc_norm_stderr\": 0.027306625297327684\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.41134751773049644,\n \"acc_stderr\": 0.029354911159940992,\n \"acc_norm\": 0.41134751773049644,\n \"acc_norm_stderr\": 0.029354911159940992\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42046936114732725,\n \"acc_stderr\": 0.012607654553832703,\n \"acc_norm\": 0.42046936114732725,\n \"acc_norm_stderr\": 0.012607654553832703\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5110294117647058,\n \"acc_stderr\": 0.030365446477275675,\n \"acc_norm\": 0.5110294117647058,\n \"acc_norm_stderr\": 0.030365446477275675\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5147058823529411,\n \"acc_stderr\": 0.020219083895133924,\n \"acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.020219083895133924\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.689795918367347,\n \"acc_stderr\": 0.029613459872484375,\n \"acc_norm\": 0.689795918367347,\n \"acc_norm_stderr\": 0.029613459872484375\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7761194029850746,\n \"acc_stderr\": 0.0294752502360172,\n \"acc_norm\": 0.7761194029850746,\n \"acc_norm_stderr\": 0.0294752502360172\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.463855421686747,\n \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.0312678171466318,\n \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.0312678171466318\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3463892288861689,\n \"mc1_stderr\": 0.016656997109125146,\n \"mc2\": 0.4983560382203712,\n \"mc2_stderr\": 0.015025857208173432\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7324388318863457,\n \"acc_stderr\": 0.012441718456893009\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.04700530705079606,\n \"acc_stderr\": 0.005829898355937175\n }\n}\n```", "repo_url": "https://huggingface.co/OpenBuddy/openbuddy-zephyr-7b-v14.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|arc:challenge|25_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|gsm8k|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hellaswag|10_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T10-04-44.823232.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T10-04-44.823232.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T10-04-44.823232.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T10-04-44.823232.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T10-04-44.823232.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["**/details_harness|winogrande|5_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-05T10-04-44.823232.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_05T10_04_44.823232", "path": ["results_2024-01-05T10-04-44.823232.parquet"]}, {"split": "latest", "path": ["results_2024-01-05T10-04-44.823232.parquet"]}]}]} | 2024-01-05T10:07:24+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of OpenBuddy/openbuddy-zephyr-7b-v14.1
Dataset automatically created during the evaluation run of model OpenBuddy/openbuddy-zephyr-7b-v14.1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
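A minimal sketch of that load call, mirroring the snippet used by the other evaluation cards in this collection; the exact dataset repository slug below is an assumption inferred from the model name, and "harness_winogrande_5" is one of the configs listed in this card's metadata:

```python
from datasets import load_dataset

# Sketch only: the repository slug follows the usual
# "open-llm-leaderboard/details_<org>__<model>" pattern and may differ
# from the actual repository name.
data = load_dataset(
    "open-llm-leaderboard/details_OpenBuddy__openbuddy-zephyr-7b-v14.1",
    "harness_winogrande_5",
    split="train",
)
```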
## Latest results
These are the latest results from run 2024-01-05T10:04:44.823232 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of OpenBuddy/openbuddy-zephyr-7b-v14.1\n\n\n\nDataset automatically created during the evaluation run of model OpenBuddy/openbuddy-zephyr-7b-v14.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T10:04:44.823232(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of OpenBuddy/openbuddy-zephyr-7b-v14.1\n\n\n\nDataset automatically created during the evaluation run of model OpenBuddy/openbuddy-zephyr-7b-v14.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T10:04:44.823232(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
193,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of OpenBuddy/openbuddy-zephyr-7b-v14.1\n\n\n\nDataset automatically created during the evaluation run of model OpenBuddy/openbuddy-zephyr-7b-v14.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-05T10:04:44.823232(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
b7f350e4c487afb8301d8c4852e8b0227242c5d6 | # Dataset Card for "rmh_subset_medium2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | thorirhrafn/rmh_subset_medium2 | [
"region:us"
] | 2024-01-05T10:11:52+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}, {"split": "valid", "path": "data/valid-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 775259169, "num_examples": 282160}, {"name": "test", "num_bytes": 4398683, "num_examples": 2000}, {"name": "valid", "num_bytes": 4543850, "num_examples": 2000}], "download_size": 480237633, "dataset_size": 784201702}} | 2024-01-05T10:12:26+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "rmh_subset_medium2"
More Information needed | [
"# Dataset Card for \"rmh_subset_medium2\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"rmh_subset_medium2\"\n\nMore Information needed"
] | [
6,
19
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"rmh_subset_medium2\"\n\nMore Information needed"
] |
c3b2f0939823bae0be468c22a0d31b36161b989a |
# Dataset Card for Evaluation run of perlthoughts/Chupacabra-7B-v2.04
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [perlthoughts/Chupacabra-7B-v2.04](https://huggingface.co/perlthoughts/Chupacabra-7B-v2.04) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2.04",
"harness_winogrande_5",
split="train")
```
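Similarly, the aggregated scores for the most recent run can be pulled from the "results" configuration via its "latest" split (a small sketch, assuming the "results" config is laid out as described above):

```python
from datasets import load_dataset

# The "latest" split always points at the newest results file for this model,
# so re-running this after a new evaluation picks up the fresh scores.
results = load_dataset(
    "open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2.04",
    "results",
    split="latest",
)
```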
## Latest results
These are the [latest results from run 2024-01-05T10:12:55.038964](https://huggingface.co/datasets/open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2.04/blob/main/results_2024-01-05T10-12-55.038964.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6117535002239652,
"acc_stderr": 0.03305067073551487,
"acc_norm": 0.6144598700035333,
"acc_norm_stderr": 0.033716352758886466,
"mc1": 0.5275397796817626,
"mc1_stderr": 0.01747693019071219,
"mc2": 0.6775807253391397,
"mc2_stderr": 0.014911725947999506
},
"harness|arc:challenge|25": {
"acc": 0.6220136518771331,
"acc_stderr": 0.0141696645203031,
"acc_norm": 0.6629692832764505,
"acc_norm_stderr": 0.013813476652902276
},
"harness|hellaswag|10": {
"acc": 0.6577375024895439,
"acc_stderr": 0.004734972668299617,
"acc_norm": 0.8570005974905397,
"acc_norm_stderr": 0.0034935679140933006
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.038234289699266046,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.038234289699266046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594963,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594963
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370333,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.025225450284067887,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.025225450284067887
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5903225806451613,
"acc_stderr": 0.027976054915347357,
"acc_norm": 0.5903225806451613,
"acc_norm_stderr": 0.027976054915347357
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885416,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885416
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.03095405547036589,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.03095405547036589
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.025416343096306422,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.025416343096306422
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6051282051282051,
"acc_stderr": 0.0247843169421564,
"acc_norm": 0.6051282051282051,
"acc_norm_stderr": 0.0247843169421564
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.02803792996911499,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.02803792996911499
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7981651376146789,
"acc_stderr": 0.017208579357787586,
"acc_norm": 0.7981651376146789,
"acc_norm_stderr": 0.017208579357787586
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849313,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849313
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159253,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159253
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.031811497470553604,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.031811497470553604
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.03880848301082393,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.03880848301082393
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094634,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094634
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.044986763205729245,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.044986763205729245
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281344,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281344
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8007662835249042,
"acc_stderr": 0.014283378044296418,
"acc_norm": 0.8007662835249042,
"acc_norm_stderr": 0.014283378044296418
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.02447699407624734,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.02447699407624734
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3307262569832402,
"acc_stderr": 0.01573502625896612,
"acc_norm": 0.3307262569832402,
"acc_norm_stderr": 0.01573502625896612
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.026415601914388992,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.026415601914388992
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.025842248700902168,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.025842248700902168
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4491525423728814,
"acc_stderr": 0.012704030518851486,
"acc_norm": 0.4491525423728814,
"acc_norm_stderr": 0.012704030518851486
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.625,
"acc_stderr": 0.029408372932278746,
"acc_norm": 0.625,
"acc_norm_stderr": 0.029408372932278746
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6503267973856209,
"acc_stderr": 0.019291961895066385,
"acc_norm": 0.6503267973856209,
"acc_norm_stderr": 0.019291961895066385
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.689795918367347,
"acc_stderr": 0.029613459872484378,
"acc_norm": 0.689795918367347,
"acc_norm_stderr": 0.029613459872484378
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6368159203980099,
"acc_stderr": 0.034005985055990146,
"acc_norm": 0.6368159203980099,
"acc_norm_stderr": 0.034005985055990146
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.02753912288906145,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.02753912288906145
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5275397796817626,
"mc1_stderr": 0.01747693019071219,
"mc2": 0.6775807253391397,
"mc2_stderr": 0.014911725947999506
},
"harness|winogrande|5": {
"acc": 0.7892659826361483,
"acc_stderr": 0.01146204641971069
},
"harness|gsm8k|5": {
"acc": 0.514783927217589,
"acc_stderr": 0.0137664630507876
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2.04 | [
"region:us"
] | 2024-01-05T10:15:13+00:00 | {"pretty_name": "Evaluation run of perlthoughts/Chupacabra-7B-v2.04", "dataset_summary": "Dataset automatically created during the evaluation run of model [perlthoughts/Chupacabra-7B-v2.04](https://huggingface.co/perlthoughts/Chupacabra-7B-v2.04) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2.04\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-05T10:12:55.038964](https://huggingface.co/datasets/open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2.04/blob/main/results_2024-01-05T10-12-55.038964.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6117535002239652,\n \"acc_stderr\": 0.03305067073551487,\n \"acc_norm\": 0.6144598700035333,\n \"acc_norm_stderr\": 0.033716352758886466,\n \"mc1\": 0.5275397796817626,\n \"mc1_stderr\": 0.01747693019071219,\n \"mc2\": 0.6775807253391397,\n \"mc2_stderr\": 0.014911725947999506\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6220136518771331,\n \"acc_stderr\": 0.0141696645203031,\n \"acc_norm\": 0.6629692832764505,\n \"acc_norm_stderr\": 0.013813476652902276\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6577375024895439,\n \"acc_stderr\": 0.004734972668299617,\n \"acc_norm\": 0.8570005974905397,\n \"acc_norm_stderr\": 0.0034935679140933006\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.038234289699266046,\n \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.038234289699266046\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n 
\"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.03255525359340355,\n \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.03255525359340355\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n \"acc_stderr\": 0.04657047260594963,\n \"acc_norm\": 0.4298245614035088,\n \"acc_norm_stderr\": 0.04657047260594963\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370333,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370333\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3994708994708995,\n \"acc_stderr\": 0.025225450284067887,\n \"acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.025225450284067887\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5903225806451613,\n \"acc_stderr\": 0.027976054915347357,\n \"acc_norm\": 0.5903225806451613,\n \"acc_norm_stderr\": 0.027976054915347357\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885416,\n \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885416\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7474747474747475,\n \"acc_stderr\": 0.03095405547036589,\n \"acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.03095405547036589\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306422,\n \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306422\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6051282051282051,\n \"acc_stderr\": 0.0247843169421564,\n \"acc_norm\": 0.6051282051282051,\n \"acc_norm_stderr\": 0.0247843169421564\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3037037037037037,\n \"acc_stderr\": 0.02803792996911499,\n \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.02803792996911499\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7981651376146789,\n \"acc_stderr\": 0.017208579357787586,\n \"acc_norm\": 0.7981651376146789,\n \"acc_norm_stderr\": 0.017208579357787586\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849313,\n \"acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849313\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159253,\n \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159253\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n \"acc_stderr\": 0.031811497470553604,\n \"acc_norm\": 0.6591928251121076,\n \"acc_norm_stderr\": 0.031811497470553604\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.03880848301082393,\n \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.03880848301082393\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.04236511258094634,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.04236511258094634\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.044986763205729245,\n \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.044986763205729245\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281344,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281344\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8007662835249042,\n \"acc_stderr\": 0.014283378044296418,\n \"acc_norm\": 0.8007662835249042,\n \"acc_norm_stderr\": 0.014283378044296418\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.02447699407624734,\n \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.02447699407624734\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3307262569832402,\n \"acc_stderr\": 0.01573502625896612,\n \"acc_norm\": 0.3307262569832402,\n \"acc_norm_stderr\": 0.01573502625896612\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.026415601914388992,\n \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.026415601914388992\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.025842248700902168,\n \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.025842248700902168\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4491525423728814,\n \"acc_stderr\": 0.012704030518851486,\n \"acc_norm\": 0.4491525423728814,\n \"acc_norm_stderr\": 0.012704030518851486\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.625,\n \"acc_stderr\": 0.029408372932278746,\n \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.029408372932278746\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6503267973856209,\n \"acc_stderr\": 0.019291961895066385,\n \"acc_norm\": 0.6503267973856209,\n \"acc_norm_stderr\": 0.019291961895066385\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.689795918367347,\n \"acc_stderr\": 0.029613459872484378,\n \"acc_norm\": 0.689795918367347,\n \"acc_norm_stderr\": 0.029613459872484378\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6368159203980099,\n \"acc_stderr\": 0.034005985055990146,\n \"acc_norm\": 0.6368159203980099,\n \"acc_norm_stderr\": 0.034005985055990146\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.02753912288906145,\n \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.02753912288906145\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5275397796817626,\n \"mc1_stderr\": 0.01747693019071219,\n \"mc2\": 0.6775807253391397,\n \"mc2_stderr\": 0.014911725947999506\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7892659826361483,\n \"acc_stderr\": 0.01146204641971069\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.514783927217589,\n \"acc_stderr\": 0.0137664630507876\n }\n}\n```", "repo_url": 
"https://huggingface.co/perlthoughts/Chupacabra-7B-v2.04", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|arc:challenge|25_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|gsm8k|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hellaswag|10_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T10-12-55.038964.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T10-12-55.038964.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T10-12-55.038964.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T10-12-55.038964.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T10-12-55.038964.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_05T10_12_55.038964", "path": ["**/details_harness|winogrande|5_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-05T10-12-55.038964.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_05T10_12_55.038964", "path": ["results_2024-01-05T10-12-55.038964.parquet"]}, {"split": "latest", "path": ["results_2024-01-05T10-12-55.038964.parquet"]}]}]} | 2024-01-05T10:15:34+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of perlthoughts/Chupacabra-7B-v2.04
Dataset automatically created during the evaluation run of model perlthoughts/Chupacabra-7B-v2.04 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
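The original snippet was dropped from this copy of the card; a minimal sketch follows, assuming the repository id follows the leaderboard's usual `details_<org>__<model>` naming for this model:

```python
from datasets import load_dataset

# Repository id is assumed from the leaderboard's naming convention;
# "harness_winogrande_5" is one of the per-task configurations.
data = load_dataset("open-llm-leaderboard/details_perlthoughts__Chupacabra-7B-v2.04",
	"harness_winogrande_5",
	split="train")
```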
## Latest results
These are the latest results from run 2024-01-05T10:12:55.038964 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of perlthoughts/Chupacabra-7B-v2.04\n\n\n\nDataset automatically created during the evaluation run of model perlthoughts/Chupacabra-7B-v2.04 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T10:12:55.038964(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of perlthoughts/Chupacabra-7B-v2.04\n\n\n\nDataset automatically created during the evaluation run of model perlthoughts/Chupacabra-7B-v2.04 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T10:12:55.038964(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
191,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of perlthoughts/Chupacabra-7B-v2.04\n\n\n\nDataset automatically created during the evaluation run of model perlthoughts/Chupacabra-7B-v2.04 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-05T10:12:55.038964(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
5cd11aac3765deded3db796f7602706face273fd | # Dataset Card for "cruciverba"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | mii-llm/cruciverba | [
"region:us"
] | 2024-01-05T10:21:11+00:00 | {"dataset_info": {"features": [{"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 6790785, "num_examples": 41793}], "download_size": 2407727, "dataset_size": 6790785}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-05T10:21:15+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "cruciverba"
More Information needed | [
"# Dataset Card for \"cruciverba\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"cruciverba\"\n\nMore Information needed"
] | [
6,
14
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"cruciverba\"\n\nMore Information needed"
] |
7df8b9ce98a74621f8d19c51aeee5316493c74c6 |
# Dataset Card for Evaluation run of whiterabbitneo/WhiteRabbitNeo-13B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [whiterabbitneo/WhiteRabbitNeo-13B](https://huggingface.co/whiterabbitneo/WhiteRabbitNeo-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_whiterabbitneo__WhiteRabbitNeo-13B",
"harness_winogrande_5",
split="train")
```
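
The aggregated metrics live in the "results" configuration; below is a minimal sketch of pulling the most recent aggregated run, assuming the "latest" split naming used for the other configurations in this card:

```python
from datasets import load_dataset

# "results" holds the aggregated metrics for the run; the "latest" split
# is assumed to point at the most recent evaluation, as it does for the
# per-task configurations.
results = load_dataset("open-llm-leaderboard/details_whiterabbitneo__WhiteRabbitNeo-13B",
	"results",
	split="latest")
print(results[0])
```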
## Latest results
These are the [latest results from run 2024-01-05T10:34:55.691217](https://huggingface.co/datasets/open-llm-leaderboard/details_whiterabbitneo__WhiteRabbitNeo-13B/blob/main/results_2024-01-05T10-34-55.691217.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4325743019051002,
"acc_stderr": 0.03450564854492944,
"acc_norm": 0.4356434201033021,
"acc_norm_stderr": 0.03525272782306864,
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.44577231939553535,
"mc2_stderr": 0.014884190006288057
},
"harness|arc:challenge|25": {
"acc": 0.4462457337883959,
"acc_stderr": 0.014526705548539982,
"acc_norm": 0.4854948805460751,
"acc_norm_stderr": 0.014605241081370056
},
"harness|hellaswag|10": {
"acc": 0.5126468830910177,
"acc_stderr": 0.0049881849883452855,
"acc_norm": 0.6870145389364668,
"acc_norm_stderr": 0.004627607991626908
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.04135176749720386,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.04135176749720386
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.040179012759817494,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.040179012759817494
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4377358490566038,
"acc_stderr": 0.03053333843046751,
"acc_norm": 0.4377358490566038,
"acc_norm_stderr": 0.03053333843046751
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3958333333333333,
"acc_stderr": 0.04089465449325582,
"acc_norm": 0.3958333333333333,
"acc_norm_stderr": 0.04089465449325582
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3815028901734104,
"acc_stderr": 0.037038511930995194,
"acc_norm": 0.3815028901734104,
"acc_norm_stderr": 0.037038511930995194
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.03177821250236922,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.03177821250236922
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.043036840335373146,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.043036840335373146
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.29894179894179895,
"acc_stderr": 0.02357760479165581,
"acc_norm": 0.29894179894179895,
"acc_norm_stderr": 0.02357760479165581
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795132,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795132
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3774193548387097,
"acc_stderr": 0.027575960723278236,
"acc_norm": 0.3774193548387097,
"acc_norm_stderr": 0.027575960723278236
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3399014778325123,
"acc_stderr": 0.033327690684107895,
"acc_norm": 0.3399014778325123,
"acc_norm_stderr": 0.033327690684107895
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.03851716319398395,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.03851716319398395
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.035476014940069384,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.035476014940069384
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5440414507772021,
"acc_stderr": 0.03594413711272437,
"acc_norm": 0.5440414507772021,
"acc_norm_stderr": 0.03594413711272437
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3435897435897436,
"acc_stderr": 0.024078696580635474,
"acc_norm": 0.3435897435897436,
"acc_norm_stderr": 0.024078696580635474
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3949579831932773,
"acc_stderr": 0.031753678460966245,
"acc_norm": 0.3949579831932773,
"acc_norm_stderr": 0.031753678460966245
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.544954128440367,
"acc_stderr": 0.02135050309092517,
"acc_norm": 0.544954128440367,
"acc_norm_stderr": 0.02135050309092517
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.032568505702936464,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.032568505702936464
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6127450980392157,
"acc_stderr": 0.03418931233833343,
"acc_norm": 0.6127450980392157,
"acc_norm_stderr": 0.03418931233833343
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6329113924050633,
"acc_stderr": 0.031376240725616185,
"acc_norm": 0.6329113924050633,
"acc_norm_stderr": 0.031376240725616185
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4618834080717489,
"acc_stderr": 0.03346015011973228,
"acc_norm": 0.4618834080717489,
"acc_norm_stderr": 0.03346015011973228
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3969465648854962,
"acc_stderr": 0.04291135671009224,
"acc_norm": 0.3969465648854962,
"acc_norm_stderr": 0.04291135671009224
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.043457245702925335,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.043457245702925335
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.04832853553437055,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.04832853553437055
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.49693251533742333,
"acc_stderr": 0.03928297078179663,
"acc_norm": 0.49693251533742333,
"acc_norm_stderr": 0.03928297078179663
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340455,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340455
},
"harness|hendrycksTest-management|5": {
"acc": 0.5048543689320388,
"acc_stderr": 0.049505043821289195,
"acc_norm": 0.5048543689320388,
"acc_norm_stderr": 0.049505043821289195
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6752136752136753,
"acc_stderr": 0.03067902276549883,
"acc_norm": 0.6752136752136753,
"acc_norm_stderr": 0.03067902276549883
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5466155810983397,
"acc_stderr": 0.0178020871358503,
"acc_norm": 0.5466155810983397,
"acc_norm_stderr": 0.0178020871358503
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4624277456647399,
"acc_stderr": 0.026842985519615375,
"acc_norm": 0.4624277456647399,
"acc_norm_stderr": 0.026842985519615375
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2759776536312849,
"acc_stderr": 0.014950103002475353,
"acc_norm": 0.2759776536312849,
"acc_norm_stderr": 0.014950103002475353
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.028275490156791434,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.028275490156791434
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4662379421221865,
"acc_stderr": 0.028333277109562783,
"acc_norm": 0.4662379421221865,
"acc_norm_stderr": 0.028333277109562783
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.027801656212323674,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.027801656212323674
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3723404255319149,
"acc_stderr": 0.028838921471251458,
"acc_norm": 0.3723404255319149,
"acc_norm_stderr": 0.028838921471251458
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3285528031290743,
"acc_stderr": 0.01199602724750291,
"acc_norm": 0.3285528031290743,
"acc_norm_stderr": 0.01199602724750291
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3014705882352941,
"acc_stderr": 0.027875982114273168,
"acc_norm": 0.3014705882352941,
"acc_norm_stderr": 0.027875982114273168
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.019751726508762626,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.019751726508762626
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4909090909090909,
"acc_stderr": 0.04788339768702861,
"acc_norm": 0.4909090909090909,
"acc_norm_stderr": 0.04788339768702861
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5469387755102041,
"acc_stderr": 0.03186785930004129,
"acc_norm": 0.5469387755102041,
"acc_norm_stderr": 0.03186785930004129
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.48756218905472637,
"acc_stderr": 0.0353443984853958,
"acc_norm": 0.48756218905472637,
"acc_norm_stderr": 0.0353443984853958
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3855421686746988,
"acc_stderr": 0.037891344246115496,
"acc_norm": 0.3855421686746988,
"acc_norm_stderr": 0.037891344246115496
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.03829509868994727,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.03829509868994727
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.44577231939553535,
"mc2_stderr": 0.014884190006288057
},
"harness|winogrande|5": {
"acc": 0.6740331491712708,
"acc_stderr": 0.013173782636922187
},
"harness|gsm8k|5": {
"acc": 0.22365428354814254,
"acc_stderr": 0.011477795578836105
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
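
Pending a fuller description, the per-task configurations described above can be enumerated programmatically; a minimal sketch, assuming the standard `datasets` APIs:

```python
from datasets import get_dataset_config_names

# List the available configurations (one per evaluated task,
# plus the aggregated "results" configuration).
configs = get_dataset_config_names("open-llm-leaderboard/details_whiterabbitneo__WhiteRabbitNeo-13B")
print(len(configs), configs[:5])
```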
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_whiterabbitneo__WhiteRabbitNeo-13B | [
"region:us"
] | 2024-01-05T10:37:18+00:00 | {"pretty_name": "Evaluation run of whiterabbitneo/WhiteRabbitNeo-13B", "dataset_summary": "Dataset automatically created during the evaluation run of model [whiterabbitneo/WhiteRabbitNeo-13B](https://huggingface.co/whiterabbitneo/WhiteRabbitNeo-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_whiterabbitneo__WhiteRabbitNeo-13B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-05T10:34:55.691217](https://huggingface.co/datasets/open-llm-leaderboard/details_whiterabbitneo__WhiteRabbitNeo-13B/blob/main/results_2024-01-05T10-34-55.691217.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4325743019051002,\n \"acc_stderr\": 0.03450564854492944,\n \"acc_norm\": 0.4356434201033021,\n \"acc_norm_stderr\": 0.03525272782306864,\n \"mc1\": 0.29008567931456547,\n \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.44577231939553535,\n \"mc2_stderr\": 0.014884190006288057\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.4462457337883959,\n \"acc_stderr\": 0.014526705548539982,\n \"acc_norm\": 0.4854948805460751,\n \"acc_norm_stderr\": 0.014605241081370056\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5126468830910177,\n \"acc_stderr\": 0.0049881849883452855,\n \"acc_norm\": 0.6870145389364668,\n \"acc_norm_stderr\": 0.004627607991626908\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.35555555555555557,\n \"acc_stderr\": 0.04135176749720386,\n \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.04135176749720386\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.040179012759817494,\n \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.040179012759817494\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.4377358490566038,\n \"acc_stderr\": 0.03053333843046751,\n \"acc_norm\": 0.4377358490566038,\n \"acc_norm_stderr\": 0.03053333843046751\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3958333333333333,\n \"acc_stderr\": 0.04089465449325582,\n \"acc_norm\": 0.3958333333333333,\n \"acc_norm_stderr\": 0.04089465449325582\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3815028901734104,\n \"acc_stderr\": 0.037038511930995194,\n \"acc_norm\": 0.3815028901734104,\n \"acc_norm_stderr\": 0.037038511930995194\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.03177821250236922,\n \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.03177821250236922\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.043036840335373146,\n \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.043036840335373146\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.29894179894179895,\n \"acc_stderr\": 0.02357760479165581,\n \"acc_norm\": 0.29894179894179895,\n \"acc_norm_stderr\": 0.02357760479165581\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n \"acc_stderr\": 0.04263906892795132,\n \"acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.04263906892795132\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3774193548387097,\n \"acc_stderr\": 0.027575960723278236,\n \"acc_norm\": 0.3774193548387097,\n \"acc_norm_stderr\": 0.027575960723278236\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3399014778325123,\n \"acc_stderr\": 0.033327690684107895,\n \"acc_norm\": 0.3399014778325123,\n \"acc_norm_stderr\": 0.033327690684107895\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5818181818181818,\n \"acc_stderr\": 0.03851716319398395,\n \"acc_norm\": 0.5818181818181818,\n \"acc_norm_stderr\": 0.03851716319398395\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5454545454545454,\n \"acc_stderr\": 0.035476014940069384,\n \"acc_norm\": 0.5454545454545454,\n \"acc_norm_stderr\": 0.035476014940069384\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.5440414507772021,\n \"acc_stderr\": 0.03594413711272437,\n \"acc_norm\": 0.5440414507772021,\n 
\"acc_norm_stderr\": 0.03594413711272437\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.3435897435897436,\n \"acc_stderr\": 0.024078696580635474,\n \"acc_norm\": 0.3435897435897436,\n \"acc_norm_stderr\": 0.024078696580635474\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3949579831932773,\n \"acc_stderr\": 0.031753678460966245,\n \"acc_norm\": 0.3949579831932773,\n \"acc_norm_stderr\": 0.031753678460966245\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.544954128440367,\n \"acc_stderr\": 0.02135050309092517,\n \"acc_norm\": 0.544954128440367,\n \"acc_norm_stderr\": 0.02135050309092517\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.032568505702936464,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.032568505702936464\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6127450980392157,\n \"acc_stderr\": 0.03418931233833343,\n \"acc_norm\": 0.6127450980392157,\n \"acc_norm_stderr\": 0.03418931233833343\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6329113924050633,\n \"acc_stderr\": 0.031376240725616185,\n \"acc_norm\": 0.6329113924050633,\n \"acc_norm_stderr\": 0.031376240725616185\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4618834080717489,\n \"acc_stderr\": 0.03346015011973228,\n \"acc_norm\": 0.4618834080717489,\n \"acc_norm_stderr\": 0.03346015011973228\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.3969465648854962,\n \"acc_stderr\": 0.04291135671009224,\n \"acc_norm\": 0.3969465648854962,\n \"acc_norm_stderr\": 0.04291135671009224\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6528925619834711,\n \"acc_stderr\": 0.043457245702925335,\n \"acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.043457245702925335\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.04832853553437055,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.04832853553437055\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.49693251533742333,\n \"acc_stderr\": 0.03928297078179663,\n \"acc_norm\": 0.49693251533742333,\n \"acc_norm_stderr\": 0.03928297078179663\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04287858751340455,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04287858751340455\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5048543689320388,\n \"acc_stderr\": 0.049505043821289195,\n \"acc_norm\": 0.5048543689320388,\n \"acc_norm_stderr\": 0.049505043821289195\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6752136752136753,\n \"acc_stderr\": 0.03067902276549883,\n \"acc_norm\": 0.6752136752136753,\n \"acc_norm_stderr\": 0.03067902276549883\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5466155810983397,\n \"acc_stderr\": 0.0178020871358503,\n \"acc_norm\": 0.5466155810983397,\n \"acc_norm_stderr\": 0.0178020871358503\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.4624277456647399,\n \"acc_stderr\": 0.026842985519615375,\n \"acc_norm\": 0.4624277456647399,\n \"acc_norm_stderr\": 0.026842985519615375\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2759776536312849,\n \"acc_stderr\": 0.014950103002475353,\n \"acc_norm\": 0.2759776536312849,\n \"acc_norm_stderr\": 0.014950103002475353\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.028275490156791434,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.028275490156791434\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4662379421221865,\n \"acc_stderr\": 0.028333277109562783,\n \"acc_norm\": 0.4662379421221865,\n \"acc_norm_stderr\": 0.028333277109562783\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.027801656212323674,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.027801656212323674\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3723404255319149,\n \"acc_stderr\": 0.028838921471251458,\n \"acc_norm\": 0.3723404255319149,\n \"acc_norm_stderr\": 0.028838921471251458\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3285528031290743,\n \"acc_stderr\": 0.01199602724750291,\n \"acc_norm\": 0.3285528031290743,\n \"acc_norm_stderr\": 0.01199602724750291\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.3014705882352941,\n \"acc_stderr\": 0.027875982114273168,\n \"acc_norm\": 0.3014705882352941,\n \"acc_norm_stderr\": 0.027875982114273168\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.019751726508762626,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.019751726508762626\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4909090909090909,\n \"acc_stderr\": 0.04788339768702861,\n \"acc_norm\": 0.4909090909090909,\n \"acc_norm_stderr\": 0.04788339768702861\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5469387755102041,\n \"acc_stderr\": 0.03186785930004129,\n \"acc_norm\": 0.5469387755102041,\n \"acc_norm_stderr\": 0.03186785930004129\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.48756218905472637,\n \"acc_stderr\": 0.0353443984853958,\n \"acc_norm\": 0.48756218905472637,\n \"acc_norm_stderr\": 0.0353443984853958\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3855421686746988,\n \"acc_stderr\": 0.037891344246115496,\n \"acc_norm\": 0.3855421686746988,\n \"acc_norm_stderr\": 0.037891344246115496\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.03829509868994727,\n \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.03829509868994727\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29008567931456547,\n \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.44577231939553535,\n \"mc2_stderr\": 0.014884190006288057\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6740331491712708,\n \"acc_stderr\": 0.013173782636922187\n },\n \"harness|gsm8k|5\": {\n 
\"acc\": 0.22365428354814254,\n \"acc_stderr\": 0.011477795578836105\n }\n}\n```", "repo_url": "https://huggingface.co/whiterabbitneo/WhiteRabbitNeo-13B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|arc:challenge|25_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|gsm8k|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hellaswag|10_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T10-34-55.691217.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T10-34-55.691217.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T10-34-55.691217.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T10-34-55.691217.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T10-34-55.691217.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["**/details_harness|winogrande|5_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-05T10-34-55.691217.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_05T10_34_55.691217", "path": ["results_2024-01-05T10-34-55.691217.parquet"]}, {"split": "latest", "path": ["results_2024-01-05T10-34-55.691217.parquet"]}]}]} | 2024-01-05T10:37:41+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of whiterabbitneo/WhiteRabbitNeo-13B
Dataset automatically created during the evaluation run of model whiterabbitneo/WhiteRabbitNeo-13B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
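A minimal sketch, assuming the details repository follows the leaderboard's usual `details_<org>__<model>` naming for this model; `harness_winogrande_5` is one of the configurations listed in this record's metadata:

```python
from datasets import load_dataset

# The repository id below is an assumption based on the leaderboard's
# "details_<org>__<model>" naming convention for whiterabbitneo/WhiteRabbitNeo-13B.
data = load_dataset(
    "open-llm-leaderboard/details_whiterabbitneo__WhiteRabbitNeo-13B",
    "harness_winogrande_5",  # one of the per-task configurations
    split="train",           # "train" always points to the latest results
)
print(data)
```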
## Latest results
These are the latest results from run 2024-01-05T10:34:55.691217 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of whiterabbitneo/WhiteRabbitNeo-13B\n\n\n\nDataset automatically created during the evaluation run of model whiterabbitneo/WhiteRabbitNeo-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T10:34:55.691217(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of whiterabbitneo/WhiteRabbitNeo-13B\n\n\n\nDataset automatically created during the evaluation run of model whiterabbitneo/WhiteRabbitNeo-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T10:34:55.691217(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
187,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of whiterabbitneo/WhiteRabbitNeo-13B\n\n\n\nDataset automatically created during the evaluation run of model whiterabbitneo/WhiteRabbitNeo-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-05T10:34:55.691217(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
d7d819f482d30923440afc8fa1ee6b1ae3e7565c |
# Dataset Card for "UltraChat-Mixin"
# UltraChat-Mixin Dataset
## Overview
Prompts in this dataset follow the Llama 2 prompt style.
### ChatMatic
The ChatMatic dataset is built from a mix of four other datasets, carefully selecting the best examples from each of them using GPT-4. It contains system messages, dialogs, and conversations with a conv_depth of more than 5 and longer sequence lengths. The datasets used are:
"oasst2"
"ise-uiuc/Magicoder-Evol-Instruct-110K"
"vicgalle/alpaca-gpt4"
"LDJnr/Capybara"
## Dataset Configuration
The dataset is configured as follows:
```yaml
dataset_info:
features:
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 168057889
num_examples: 69765
download_size: 88116993
dataset_size: 168057889
```
## Features
The UltraChat-Mixin dataset consists of the following features:
- **prompt**: A string containing the conversation dialog formatted as a Llama 2-style prompt.
## Splits
The dataset contains a single split:
- **train**: This split is used for training conversational AI models. It consists of 69,765 examples and has a size of approximately 168,057,889 bytes.
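A minimal loading sketch, assuming the repository id `erfanzar/LinguaMatic-Mixin` shown in this record's metadata (the card itself refers to the dataset as UltraChat-Mixin):

```python
from datasets import load_dataset

# Repository id taken from this record's metadata; the card calls the
# dataset UltraChat-Mixin, so adjust the id if it differs.
ds = load_dataset("erfanzar/LinguaMatic-Mixin", split="train")

print(ds.num_rows)            # expected: 69,765 examples
print(ds[0]["prompt"][:300])  # preview the start of one Llama 2-style prompt
```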
## Download Size
The download size of the UltraChat-Mixin dataset is approximately 88,116,993 bytes.
## Dataset Size
The total size of the UltraChat-Mixin dataset is approximately 168,057,889 bytes.
Please note that the dataset configuration and statistics provided above are based on information provided by Erfan Zare Chavoshi.
| erfanzar/LinguaMatic-Mixin | [
"task_categories:text-generation",
"task_categories:text-classification",
"task_categories:conversational",
"size_categories:1M<n<10M",
"language:en",
"language:es",
"language:ru",
"language:de",
"language:pl",
"language:th",
"language:vi",
"language:sv",
"language:bn",
"language:da",
"language:he",
"language:it",
"language:fa",
"language:sk",
"language:id",
"language:nb",
"language:el",
"language:nl",
"language:hu",
"language:eu",
"language:zh",
"language:eo",
"language:ja",
"language:ca",
"language:cs",
"language:bg",
"language:fi",
"language:pt",
"language:tr",
"language:ro",
"language:ar",
"language:uk",
"language:gl",
"language:fr",
"language:ko",
"code",
"biology",
"medical",
"region:us"
] | 2024-01-05T11:07:52+00:00 | {"language": ["en", "es", "ru", "de", "pl", "th", "vi", "sv", "bn", "da", "he", "it", "fa", "sk", "id", "nb", "el", "nl", "hu", "eu", "zh", "eo", "ja", "ca", "cs", "bg", "fi", "pt", "tr", "ro", "ar", "uk", "gl", "fr", "ko"], "size_categories": ["1M<n<10M"], "task_categories": ["text-generation", "text-classification", "conversational"], "dataset_info": {"features": [{"name": "prompt", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 168057889, "num_examples": 69765}], "download_size": 88116993, "dataset_size": 168057889}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "tags": ["code", "biology", "medical"]} | 2024-01-06T10:27:27+00:00 | [] | [
"en",
"es",
"ru",
"de",
"pl",
"th",
"vi",
"sv",
"bn",
"da",
"he",
"it",
"fa",
"sk",
"id",
"nb",
"el",
"nl",
"hu",
"eu",
"zh",
"eo",
"ja",
"ca",
"cs",
"bg",
"fi",
"pt",
"tr",
"ro",
"ar",
"uk",
"gl",
"fr",
"ko"
] | TAGS
#task_categories-text-generation #task_categories-text-classification #task_categories-conversational #size_categories-1M<n<10M #language-English #language-Spanish #language-Russian #language-German #language-Polish #language-Thai #language-Vietnamese #language-Swedish #language-Bengali #language-Danish #language-Hebrew #language-Italian #language-Persian #language-Slovak #language-Indonesian #language-Norwegian Bokmål #language-Modern Greek (1453-) #language-Dutch #language-Hungarian #language-Basque #language-Chinese #language-Esperanto #language-Japanese #language-Catalan #language-Czech #language-Bulgarian #language-Finnish #language-Portuguese #language-Turkish #language-Romanian #language-Arabic #language-Ukrainian #language-Galician #language-French #language-Korean #code #biology #medical #region-us
|
# Dataset Card for "UltraChat-Mixin"
# UltraChat-Mixin Dataset
## Overview
Prompts in this dataset follow the Llama 2 prompt style.
### ChatMatic
The ChatMatic dataset is built from a mix of four other datasets, carefully selecting the best examples from each of them using GPT-4. It contains system messages, dialogs, and conversations with a conv_depth of more than 5 and longer sequence lengths. The datasets used are:
"oasst2"
"ise-uiuc/Magicoder-Evol-Instruct-110K"
"vicgalle/alpaca-gpt4"
"LDJnr/Capybara"
## Dataset Configuration
The dataset is configured as follows:
## Features
The UltraChat-Mixin dataset consists of the following features:
- prompt: A string containing the conversation dialog formatted as a Llama 2-style prompt.
## Splits
The dataset contains a single split:
- train: This split is used for training conversational AI models. It consists of 69,765 examples and has a size of approximately 168,057,889 bytes.
## Download Size
The download size of the UltraChat-Mixin dataset is approximately 88,116,993 bytes.
## Dataset Size
The total size of the UltraChat-Mixin dataset is approximately 168,057,889 bytes.
Please note that the dataset configuration and statistics provided above are based on information provided by Erfan Zare Chavoshi.
| [
"# Dataset Card for \"UltraChat-Mixin\"",
"# UltraChat-Mixin Dataset",
"## Overview\nllama 2 prompted style frin",
"### ChatMatic\nChatMatic Dataset is built with mix of 4 other dataset and which carefully chosing best one from each one of them with using GPT-4 and contains System messages Dialogs and conv_depth more than 5 with higher sequence lengths Used datasets are:\n\n\"oasst2\"\n\"ise-uiuc/Magicoder-Evol-Instruct-110K\"\n\"vicgalle/alpaca-gpt4\"\n\"LDJnr/Capybara\"",
"## Dataset Configuration\nThe dataset is configured as follows:",
"## Features\nThe UltraChat-Mixin dataset consists of the following features:\n\n- prompt: A sequence of strings representing the conversation dialog Llama2 Style prompts.",
"## Splits\nThe dataset contains a single split:\n\n- train: This split is used for training conversational AI models. It consists of 70_000 examples and has a size of approximately 168,057,889 bytes.",
"## Download Size\nThe download size of the UltraChat-Mixin dataset is approximately 88,116,993 bytes.",
"## Dataset Size\nThe total size of the UltraChat-Mixin dataset is approximately 168,057,889 bytes.\n\nPlease note that the dataset configuration and statistics provided above are based on the information provided by Erfan zare chavoshi."
] | [
"TAGS\n#task_categories-text-generation #task_categories-text-classification #task_categories-conversational #size_categories-1M<n<10M #language-English #language-Spanish #language-Russian #language-German #language-Polish #language-Thai #language-Vietnamese #language-Swedish #language-Bengali #language-Danish #language-Hebrew #language-Italian #language-Persian #language-Slovak #language-Indonesian #language-Norwegian Bokmål #language-Modern Greek (1453-) #language-Dutch #language-Hungarian #language-Basque #language-Chinese #language-Esperanto #language-Japanese #language-Catalan #language-Czech #language-Bulgarian #language-Finnish #language-Portuguese #language-Turkish #language-Romanian #language-Arabic #language-Ukrainian #language-Galician #language-French #language-Korean #code #biology #medical #region-us \n",
"# Dataset Card for \"UltraChat-Mixin\"",
"# UltraChat-Mixin Dataset",
"## Overview\nllama 2 prompted style frin",
"### ChatMatic\nChatMatic Dataset is built with mix of 4 other dataset and which carefully chosing best one from each one of them with using GPT-4 and contains System messages Dialogs and conv_depth more than 5 with higher sequence lengths Used datasets are:\n\n\"oasst2\"\n\"ise-uiuc/Magicoder-Evol-Instruct-110K\"\n\"vicgalle/alpaca-gpt4\"\n\"LDJnr/Capybara\"",
"## Dataset Configuration\nThe dataset is configured as follows:",
"## Features\nThe UltraChat-Mixin dataset consists of the following features:\n\n- prompt: A sequence of strings representing the conversation dialog Llama2 Style prompts.",
"## Splits\nThe dataset contains a single split:\n\n- train: This split is used for training conversational AI models. It consists of 70_000 examples and has a size of approximately 168,057,889 bytes.",
"## Download Size\nThe download size of the UltraChat-Mixin dataset is approximately 88,116,993 bytes.",
"## Dataset Size\nThe total size of the UltraChat-Mixin dataset is approximately 168,057,889 bytes.\n\nPlease note that the dataset configuration and statistics provided above are based on the information provided by Erfan zare chavoshi."
] | [
253,
13,
8,
10,
111,
16,
40,
49,
25,
53
] | [
"passage: TAGS\n#task_categories-text-generation #task_categories-text-classification #task_categories-conversational #size_categories-1M<n<10M #language-English #language-Spanish #language-Russian #language-German #language-Polish #language-Thai #language-Vietnamese #language-Swedish #language-Bengali #language-Danish #language-Hebrew #language-Italian #language-Persian #language-Slovak #language-Indonesian #language-Norwegian Bokmål #language-Modern Greek (1453-) #language-Dutch #language-Hungarian #language-Basque #language-Chinese #language-Esperanto #language-Japanese #language-Catalan #language-Czech #language-Bulgarian #language-Finnish #language-Portuguese #language-Turkish #language-Romanian #language-Arabic #language-Ukrainian #language-Galician #language-French #language-Korean #code #biology #medical #region-us \n# Dataset Card for \"UltraChat-Mixin\"# UltraChat-Mixin Dataset## Overview\nllama 2 prompted style frin### ChatMatic\nChatMatic Dataset is built with mix of 4 other dataset and which carefully chosing best one from each one of them with using GPT-4 and contains System messages Dialogs and conv_depth more than 5 with higher sequence lengths Used datasets are:\n\n\"oasst2\"\n\"ise-uiuc/Magicoder-Evol-Instruct-110K\"\n\"vicgalle/alpaca-gpt4\"\n\"LDJnr/Capybara\"## Dataset Configuration\nThe dataset is configured as follows:## Features\nThe UltraChat-Mixin dataset consists of the following features:\n\n- prompt: A sequence of strings representing the conversation dialog Llama2 Style prompts.## Splits\nThe dataset contains a single split:\n\n- train: This split is used for training conversational AI models. It consists of 70_000 examples and has a size of approximately 168,057,889 bytes."
] |
6869c05e9d316b28e037cf9637b3ef41b1d559e5 |
# ChatMatic
## with Over 80,000 multi-turn examples.
The UltraChat-Matic dataset is built from a mix of four other datasets, carefully selecting the best examples from each of them using `GPT-4`.
It contains system messages, dialogs, and conversations with a conv_depth of more than 5 and longer sequence lengths.
Used datasets are:
1. "oasst2"
2. "ise-uiuc/Magicoder-Evol-Instruct-110K"
3. "vicgalle/alpaca-gpt4"
4. "LDJnr/Capybara"
### From Capybara
* Most tokens contained in this dataset are newly synthesized and did not exist prior online.
* This leverages the Amplify-Instruct method(paper coming soon) to grow thousands of high-quality single-turn seeds into advanced and in-depth multi-turn conversations.
* Average context length per conversation is over 1,000 tokens and 3 turns or more per example (most instruction/chat datasets on HF for fine-tuning are only 1 turn)
* Each conversation is optimized to amplify the natural raw knowledge capabilities of the model, as well as delving deep into obscure and advanced topics.
* Aggressively filtered to remove any and all possible examples of overt moralizing/alignment, and common undesirable behaviours such as "as an AI language model" and "September 2021" and "I don't have personal beliefs"
* ### More than 60,000 examples generated or selected by GPT-4 | erfanzar/UltraChat-Matic | [
"task_categories:text-generation",
"task_categories:text-classification",
"task_categories:conversational",
"size_categories:1M<n<10M",
"language:en",
"language:es",
"language:ru",
"language:de",
"language:pl",
"language:th",
"language:vi",
"language:sv",
"language:bn",
"language:da",
"language:he",
"language:it",
"language:fa",
"language:sk",
"language:id",
"language:nb",
"language:el",
"language:nl",
"language:hu",
"language:eu",
"language:zh",
"language:eo",
"language:ja",
"language:ca",
"language:cs",
"language:bg",
"language:fi",
"language:pt",
"language:tr",
"language:ro",
"language:ar",
"language:uk",
"language:gl",
"language:fr",
"language:ko",
"code",
"biology",
"medical",
"region:us"
] | 2024-01-05T11:17:28+00:00 | {"language": ["en", "es", "ru", "de", "pl", "th", "vi", "sv", "bn", "da", "he", "it", "fa", "sk", "id", "nb", "el", "nl", "hu", "eu", "zh", "eo", "ja", "ca", "cs", "bg", "fi", "pt", "tr", "ro", "ar", "uk", "gl", "fr", "ko"], "size_categories": ["1M<n<10M"], "task_categories": ["text-generation", "text-classification", "conversational"], "dataset_info": {"features": [{"name": "system", "dtype": "string"}, {"name": "user", "sequence": "string"}, {"name": "assistant", "sequence": "string"}, {"name": "dialogs", "sequence": "string"}, {"name": "conv_depth", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 447216231, "num_examples": 109765}], "download_size": 242424003, "dataset_size": 447216231}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "tags": ["code", "biology", "medical"]} | 2024-01-06T10:23:35+00:00 | [] | [
"en",
"es",
"ru",
"de",
"pl",
"th",
"vi",
"sv",
"bn",
"da",
"he",
"it",
"fa",
"sk",
"id",
"nb",
"el",
"nl",
"hu",
"eu",
"zh",
"eo",
"ja",
"ca",
"cs",
"bg",
"fi",
"pt",
"tr",
"ro",
"ar",
"uk",
"gl",
"fr",
"ko"
] | TAGS
#task_categories-text-generation #task_categories-text-classification #task_categories-conversational #size_categories-1M<n<10M #language-English #language-Spanish #language-Russian #language-German #language-Polish #language-Thai #language-Vietnamese #language-Swedish #language-Bengali #language-Danish #language-Hebrew #language-Italian #language-Persian #language-Slovak #language-Indonesian #language-Norwegian Bokmål #language-Modern Greek (1453-) #language-Dutch #language-Hungarian #language-Basque #language-Chinese #language-Esperanto #language-Japanese #language-Catalan #language-Czech #language-Bulgarian #language-Finnish #language-Portuguese #language-Turkish #language-Romanian #language-Arabic #language-Ukrainian #language-Galician #language-French #language-Korean #code #biology #medical #region-us
|
# ChatMatic
## with Over 80,000 multi-turn examples.
UltraChat-Matic Dataset is built with mix of 4 other dataset and which carefully chosing best one from each one of them with using 'GPT-4'
and contains
System messages Dialogs and conv_depth more than 5 with higher sequence lengths
Used datasets are:
1. "oasst2"
2. "ise-uiuc/Magicoder-Evol-Instruct-110K"
3. "vicgalle/alpaca-gpt4"
4. "LDJnr/Capybara"
### From Capybara
* Most tokens contained in this dataset are newly synthesized and did not exist prior online.
* This leverages the Amplify-Instruct method(paper coming soon) to grow thousands of high-quality single-turn seeds into advanced and in-depth multi-turn conversations.
* Average context length per conversation is over 1,000 tokens and 3 turns or more per example (most instruction/chat datasets on HF for fine-tuning are only 1 turn)
* Each conversation is optimized to amplify the natural raw knowledge capabilities of the model, as well as delving deep into obscure and advanced topics.
* Aggresively filtered to remove any and all possible examples of overt moralizing/alignment, and common undesirable behaviours such as "as an AI language model" and "September 2021" and "I don't have personal beliefs"
* ### More than 60000 Datas generated or selected by GPT4 | [
"# ChatMatic",
"## with Over 80,000 multi-turn examples.\n\nUltraChat-Matic Dataset is built with mix of 4 other dataset and which carefully chosing best one from each one of them with using 'GPT-4'\nand contains \nSystem messages Dialogs and conv_depth more than 5 with higher sequence lengths\nUsed datasets are:\n\n1. \"oasst2\"\n2. \"ise-uiuc/Magicoder-Evol-Instruct-110K\"\n3. \"vicgalle/alpaca-gpt4\"\n4. \"LDJnr/Capybara\"",
"### From Capybara \n\n* Most tokens contained in this dataset are newly synthesized and did not exist prior online.\n\n* This leverages the Amplify-Instruct method(paper coming soon) to grow thousands of high-quality single-turn seeds into advanced and in-depth multi-turn conversations.\n\n* Average context length per conversation is over 1,000 tokens and 3 turns or more per example (most instruction/chat datasets on HF for fine-tuning are only 1 turn)\n\n* Each conversation is optimized to amplify the natural raw knowledge capabilities of the model, as well as delving deep into obscure and advanced topics.\n\n* Aggresively filtered to remove any and all possible examples of overt moralizing/alignment, and common undesirable behaviours such as \"as an AI language model\" and \"September 2021\" and \"I don't have personal beliefs\"\n\n* ### More than 60000 Datas generated or selected by GPT4"
] | [
"TAGS\n#task_categories-text-generation #task_categories-text-classification #task_categories-conversational #size_categories-1M<n<10M #language-English #language-Spanish #language-Russian #language-German #language-Polish #language-Thai #language-Vietnamese #language-Swedish #language-Bengali #language-Danish #language-Hebrew #language-Italian #language-Persian #language-Slovak #language-Indonesian #language-Norwegian Bokmål #language-Modern Greek (1453-) #language-Dutch #language-Hungarian #language-Basque #language-Chinese #language-Esperanto #language-Japanese #language-Catalan #language-Czech #language-Bulgarian #language-Finnish #language-Portuguese #language-Turkish #language-Romanian #language-Arabic #language-Ukrainian #language-Galician #language-French #language-Korean #code #biology #medical #region-us \n",
"# ChatMatic",
"## with Over 80,000 multi-turn examples.\n\nUltraChat-Matic Dataset is built with mix of 4 other dataset and which carefully chosing best one from each one of them with using 'GPT-4'\nand contains \nSystem messages Dialogs and conv_depth more than 5 with higher sequence lengths\nUsed datasets are:\n\n1. \"oasst2\"\n2. \"ise-uiuc/Magicoder-Evol-Instruct-110K\"\n3. \"vicgalle/alpaca-gpt4\"\n4. \"LDJnr/Capybara\"",
"### From Capybara \n\n* Most tokens contained in this dataset are newly synthesized and did not exist prior online.\n\n* This leverages the Amplify-Instruct method(paper coming soon) to grow thousands of high-quality single-turn seeds into advanced and in-depth multi-turn conversations.\n\n* Average context length per conversation is over 1,000 tokens and 3 turns or more per example (most instruction/chat datasets on HF for fine-tuning are only 1 turn)\n\n* Each conversation is optimized to amplify the natural raw knowledge capabilities of the model, as well as delving deep into obscure and advanced topics.\n\n* Aggresively filtered to remove any and all possible examples of overt moralizing/alignment, and common undesirable behaviours such as \"as an AI language model\" and \"September 2021\" and \"I don't have personal beliefs\"\n\n* ### More than 60000 Datas generated or selected by GPT4"
] | [
253,
4,
126,
223
] | [
"passage: TAGS\n#task_categories-text-generation #task_categories-text-classification #task_categories-conversational #size_categories-1M<n<10M #language-English #language-Spanish #language-Russian #language-German #language-Polish #language-Thai #language-Vietnamese #language-Swedish #language-Bengali #language-Danish #language-Hebrew #language-Italian #language-Persian #language-Slovak #language-Indonesian #language-Norwegian Bokmål #language-Modern Greek (1453-) #language-Dutch #language-Hungarian #language-Basque #language-Chinese #language-Esperanto #language-Japanese #language-Catalan #language-Czech #language-Bulgarian #language-Finnish #language-Portuguese #language-Turkish #language-Romanian #language-Arabic #language-Ukrainian #language-Galician #language-French #language-Korean #code #biology #medical #region-us \n# ChatMatic## with Over 80,000 multi-turn examples.\n\nUltraChat-Matic Dataset is built with mix of 4 other dataset and which carefully chosing best one from each one of them with using 'GPT-4'\nand contains \nSystem messages Dialogs and conv_depth more than 5 with higher sequence lengths\nUsed datasets are:\n\n1. \"oasst2\"\n2. \"ise-uiuc/Magicoder-Evol-Instruct-110K\"\n3. \"vicgalle/alpaca-gpt4\"\n4. \"LDJnr/Capybara\""
] |
8d39132f7c1ef7508dc4a6f044a301c5a74c8dfc |
# Dataset Card for Evaluation run of ZoidBB/Jovian-10.7B-v1.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ZoidBB/Jovian-10.7B-v1.0](https://huggingface.co/ZoidBB/Jovian-10.7B-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ZoidBB__Jovian-10.7B-v1.0",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-05T22:02:39.167169](https://huggingface.co/datasets/open-llm-leaderboard/details_ZoidBB__Jovian-10.7B-v1.0/blob/main/results_2024-01-05T22-02-39.167169.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6580437344612732,
"acc_stderr": 0.03198478309112164,
"acc_norm": 0.6603650951482142,
"acc_norm_stderr": 0.03262928239402026,
"mc1": 0.3635250917992656,
"mc1_stderr": 0.016838862883965834,
"mc2": 0.5200231220903274,
"mc2_stderr": 0.015200059344934734
},
"harness|arc:challenge|25": {
"acc": 0.6510238907849829,
"acc_stderr": 0.013928933461382501,
"acc_norm": 0.674061433447099,
"acc_norm_stderr": 0.013697432466693249
},
"harness|hellaswag|10": {
"acc": 0.6752638916550487,
"acc_stderr": 0.00467319142386121,
"acc_norm": 0.8639713204540929,
"acc_norm_stderr": 0.0034211839093201612
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.028049186315695248,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.028049186315695248
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287533,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287533
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6068965517241379,
"acc_stderr": 0.040703290137070705,
"acc_norm": 0.6068965517241379,
"acc_norm_stderr": 0.040703290137070705
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4470899470899471,
"acc_stderr": 0.025606723995777025,
"acc_norm": 0.4470899470899471,
"acc_norm_stderr": 0.025606723995777025
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7967741935483871,
"acc_stderr": 0.022891687984554952,
"acc_norm": 0.7967741935483871,
"acc_norm_stderr": 0.022891687984554952
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.023854795680971125,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.023854795680971125
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251976,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251976
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.02995382389188704,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.02995382389188704
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.015776239256163248,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.015776239256163248
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5601851851851852,
"acc_stderr": 0.033851779760448106,
"acc_norm": 0.5601851851851852,
"acc_norm_stderr": 0.033851779760448106
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639318,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233483,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233483
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.03457272836917671,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.03457272836917671
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.04058042015646034,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.04058042015646034
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.01358661921990333,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.01358661921990333
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069363,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069363
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43798882681564244,
"acc_stderr": 0.01659339422756484,
"acc_norm": 0.43798882681564244,
"acc_norm_stderr": 0.01659339422756484
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7459807073954984,
"acc_stderr": 0.024723861504771693,
"acc_norm": 0.7459807073954984,
"acc_norm_stderr": 0.024723861504771693
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.023891879541959607,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.023891879541959607
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.48565840938722293,
"acc_stderr": 0.012764981829524277,
"acc_norm": 0.48565840938722293,
"acc_norm_stderr": 0.012764981829524277
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7132352941176471,
"acc_stderr": 0.02747227447323381,
"acc_norm": 0.7132352941176471,
"acc_norm_stderr": 0.02747227447323381
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.01897542792050721,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.01897542792050721
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7510204081632653,
"acc_stderr": 0.027682979522960234,
"acc_norm": 0.7510204081632653,
"acc_norm_stderr": 0.027682979522960234
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306053,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306053
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.038695433234721015,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.038695433234721015
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3635250917992656,
"mc1_stderr": 0.016838862883965834,
"mc2": 0.5200231220903274,
"mc2_stderr": 0.015200059344934734
},
"harness|winogrande|5": {
"acc": 0.8184688239936859,
"acc_stderr": 0.01083327651500749
},
"harness|gsm8k|5": {
"acc": 0.5724033358605004,
"acc_stderr": 0.013627322286986815
}
}
```
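As a hedged sketch (not part of the auto-generated card), the aggregated figures above can also be pulled programmatically from the "results" configuration described earlier; the "latest" split name is an assumption based on the split naming used by the per-task configs in this repository:

```python
from datasets import load_dataset

# Minimal sketch: load the aggregated metrics for the most recent run.
# Assumption: the "results" config exposes a "latest" split like the
# per-task configs do; fall back to a timestamped split if it does not.
results = load_dataset(
    "open-llm-leaderboard/details_ZoidBB__Jovian-10.7B-v1.0",
    "results",
    split="latest",
)
print(results[0])  # dictionary of aggregated acc / stderr values
```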
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_ZoidBB__Jovian-10.7B-v1.0 | [
"region:us"
] | 2024-01-05T11:30:46+00:00 | {"pretty_name": "Evaluation run of ZoidBB/Jovian-10.7B-v1.0", "dataset_summary": "Dataset automatically created during the evaluation run of model [ZoidBB/Jovian-10.7B-v1.0](https://huggingface.co/ZoidBB/Jovian-10.7B-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ZoidBB__Jovian-10.7B-v1.0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-05T22:02:39.167169](https://huggingface.co/datasets/open-llm-leaderboard/details_ZoidBB__Jovian-10.7B-v1.0/blob/main/results_2024-01-05T22-02-39.167169.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6580437344612732,\n \"acc_stderr\": 0.03198478309112164,\n \"acc_norm\": 0.6603650951482142,\n \"acc_norm_stderr\": 0.03262928239402026,\n \"mc1\": 0.3635250917992656,\n \"mc1_stderr\": 0.016838862883965834,\n \"mc2\": 0.5200231220903274,\n \"mc2_stderr\": 0.015200059344934734\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6510238907849829,\n \"acc_stderr\": 0.013928933461382501,\n \"acc_norm\": 0.674061433447099,\n \"acc_norm_stderr\": 0.013697432466693249\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6752638916550487,\n \"acc_stderr\": 0.00467319142386121,\n \"acc_norm\": 0.8639713204540929,\n \"acc_norm_stderr\": 0.0034211839093201612\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.028049186315695248,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.028049186315695248\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287533,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287533\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.040703290137070705,\n \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.040703290137070705\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4470899470899471,\n \"acc_stderr\": 0.025606723995777025,\n \"acc_norm\": 0.4470899470899471,\n \"acc_norm_stderr\": 0.025606723995777025\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7967741935483871,\n \"acc_stderr\": 0.022891687984554952,\n \"acc_norm\": 0.7967741935483871,\n \"acc_norm_stderr\": 0.022891687984554952\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.023854795680971125,\n \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971125\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251976,\n \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251976\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.02995382389188704,\n \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.02995382389188704\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163248,\n \"acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163248\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5601851851851852,\n \"acc_stderr\": 0.033851779760448106,\n \"acc_norm\": 0.5601851851851852,\n \"acc_norm_stderr\": 0.033851779760448106\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.027865942286639318,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639318\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233483,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233483\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8264462809917356,\n \"acc_stderr\": 0.03457272836917671,\n \"acc_norm\": 0.8264462809917356,\n \"acc_norm_stderr\": 0.03457272836917671\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8250319284802043,\n \"acc_stderr\": 0.01358661921990333,\n \"acc_norm\": 0.8250319284802043,\n \"acc_norm_stderr\": 0.01358661921990333\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069363,\n \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069363\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43798882681564244,\n \"acc_stderr\": 0.01659339422756484,\n \"acc_norm\": 0.43798882681564244,\n \"acc_norm_stderr\": 0.01659339422756484\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7459807073954984,\n \"acc_stderr\": 0.024723861504771693,\n \"acc_norm\": 0.7459807073954984,\n \"acc_norm_stderr\": 0.024723861504771693\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.023891879541959607,\n \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.023891879541959607\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.48565840938722293,\n \"acc_stderr\": 0.012764981829524277,\n \"acc_norm\": 0.48565840938722293,\n \"acc_norm_stderr\": 0.012764981829524277\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7132352941176471,\n \"acc_stderr\": 0.02747227447323381,\n \"acc_norm\": 0.7132352941176471,\n \"acc_norm_stderr\": 0.02747227447323381\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.01897542792050721,\n \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.01897542792050721\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7510204081632653,\n \"acc_stderr\": 0.027682979522960234,\n \"acc_norm\": 0.7510204081632653,\n \"acc_norm_stderr\": 0.027682979522960234\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n \"acc_stderr\": 0.024845753212306053,\n \"acc_norm\": 0.8557213930348259,\n \"acc_norm_stderr\": 0.024845753212306053\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.038695433234721015,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.038695433234721015\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3635250917992656,\n \"mc1_stderr\": 0.016838862883965834,\n \"mc2\": 0.5200231220903274,\n \"mc2_stderr\": 0.015200059344934734\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8184688239936859,\n \"acc_stderr\": 0.01083327651500749\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5724033358605004,\n \"acc_stderr\": 
0.013627322286986815\n }\n}\n```", "repo_url": "https://huggingface.co/ZoidBB/Jovian-10.7B-v1.0", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|arc:challenge|25_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|arc:challenge|25_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|gsm8k|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|gsm8k|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hellaswag|10_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hellaswag|10_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T11-28-30.707086.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T11-28-30.707086.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T22-02-39.167169.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T22-02-39.167169.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T22-02-39.167169.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T22-02-39.167169.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T22-02-39.167169.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": 
"2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": 
"2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T11-28-30.707086.parquet"]}, 
{"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["**/details_harness|winogrande|5_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": ["**/details_harness|winogrande|5_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-05T22-02-39.167169.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_05T11_28_30.707086", "path": ["results_2024-01-05T11-28-30.707086.parquet"]}, {"split": "2024_01_05T22_02_39.167169", "path": 
["results_2024-01-05T22-02-39.167169.parquet"]}, {"split": "latest", "path": ["results_2024-01-05T22-02-39.167169.parquet"]}]}]} | 2024-01-05T22:06:25+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of ZoidBB/Jovian-10.7B-v1.0
Dataset automatically created during the evaluation run of model ZoidBB/Jovian-10.7B-v1.0 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
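A minimal sketch of what that looks like (the repository id below is an assumption, derived from the leaderboard's usual `details_<org>__<model>` naming pattern rather than stated explicitly in this card; "harness_winogrande_5" is one of the 63 configurations listed in the metadata):
```python
from datasets import load_dataset

# Repo id assumed from the leaderboard naming convention, not confirmed by this card.
data = load_dataset("open-llm-leaderboard/details_ZoidBB__Jovian-10.7B-v1.0",
                    "harness_winogrande_5",   # any of the 63 configurations works here
                    split="train")            # "train" always points to the latest run
```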
## Latest results
These are the latest results from run 2024-01-05T22:02:39.167169 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of ZoidBB/Jovian-10.7B-v1.0\n\n\n\nDataset automatically created during the evaluation run of model ZoidBB/Jovian-10.7B-v1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T22:02:39.167169(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of ZoidBB/Jovian-10.7B-v1.0\n\n\n\nDataset automatically created during the evaluation run of model ZoidBB/Jovian-10.7B-v1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T22:02:39.167169(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
189,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of ZoidBB/Jovian-10.7B-v1.0\n\n\n\nDataset automatically created during the evaluation run of model ZoidBB/Jovian-10.7B-v1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-05T22:02:39.167169(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
ca24862615b553fb163f79fdf14cb7381b8416b3 | 100% of credit for this dataset goes to Jon Durbin. This is the same dataset, converted to ultrafeedback binarized format, so it will work with the Hugging Face alignment notebook DPO script.
original dataset: https://huggingface.co/datasets/unalignment/toxic-dpo-v0.1
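A quick, hypothetical sketch of how the converted data can be inspected (the split name and column names are assumptions about the UltraFeedback binarized layout, which typically exposes `prompt`, `chosen`, and `rejected`; check the repo files to confirm):
```python
from datasets import load_dataset

# Split and column names are assumptions; the UltraFeedback binarized layout
# normally provides "prompt" plus "chosen"/"rejected" message lists for DPO.
ds = load_dataset("Vezora/testtoxic", split="train")
print(ds.column_names)
print(ds[0])
```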
this repo contains the script used to convert to the ultrafeedback format, along with the dataset. | Vezora/testtoxic | [
"license:apache-2.0",
"region:us"
] | 2024-01-05T11:43:05+00:00 | {"license": "apache-2.0"} | 2024-01-08T23:41:17+00:00 | [] | [] | TAGS
#license-apache-2.0 #region-us
| 100% of credit for this dataset goes to Jon Durbin. This is the same dataset, converted to ultrafeedback binarized format, so it will work with the Hugging Face alignment notebook DPO script.
original dataset: URL
this repo contains the script used to convert to the ultrafeedback format, along with the dataset. | [
"TAGS\n#license-apache-2.0 #region-us \n"
] | [
14
] | [
"passage: TAGS\n#license-apache-2.0 #region-us \n"
] |
229bdbfdfe7fa9ab8ead6ade5a652a38ef7985e8 | [
{
"id": 1,
"text": "What is the capital of India?",
"answer": "New Delhi"
},
{
"id": 2,
"text": "Which river is the longest in India?",
"answer": "Ganges"
},
{
"id": 3,
"text": "In which year did India gain independence?",
"answer": "1947"
},
{
"id": 4,
"text": "What is the currency of India?",
"answer": "Indian Rupee"
},
{
"id": 5,
"text": "Which mountain range is in the northern part of India?",
"answer": "Himalayas"
}
] | SimnaFa/my_data | [
"region:us"
] | 2024-01-05T11:45:31+00:00 | {} | 2024-01-05T11:46:43+00:00 | [] | [] | TAGS
#region-us
| [
{
"id": 1,
"text": "What is the capital of India?",
"answer": "New Delhi"
},
{
"id": 2,
"text": "Which river is the longest in India?",
"answer": "Ganges"
},
{
"id": 3,
"text": "In which year did India gain independence?",
"answer": "1947"
},
{
"id": 4,
"text": "What is the currency of India?",
"answer": "Indian Rupee"
},
{
"id": 5,
"text": "Which mountain range is in the northern part of India?",
"answer": "Himalayas"
}
] | [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
33d2498a5b7c8d4d47b3be000c76d4db9949463a |
# Dataset Card for Evaluation run of UCLA-AGI/test0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [UCLA-AGI/test0](https://huggingface.co/UCLA-AGI/test0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_UCLA-AGI__test0",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-05T12:00:11.696869](https://huggingface.co/datasets/open-llm-leaderboard/details_UCLA-AGI__test0/blob/main/results_2024-01-05T12-00-11.696869.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.60942938330715,
"acc_stderr": 0.032864296350108284,
"acc_norm": 0.6144795102535505,
"acc_norm_stderr": 0.03353247027217966,
"mc1": 0.3525091799265606,
"mc1_stderr": 0.016724646380756547,
"mc2": 0.5048056232375348,
"mc2_stderr": 0.01591507188780232
},
"harness|arc:challenge|25": {
"acc": 0.6006825938566553,
"acc_stderr": 0.01431209455794671,
"acc_norm": 0.636518771331058,
"acc_norm_stderr": 0.014056207319068283
},
"harness|hellaswag|10": {
"acc": 0.6543517227643896,
"acc_stderr": 0.00474607219107258,
"acc_norm": 0.844353714399522,
"acc_norm_stderr": 0.0036177879347477483
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5481481481481482,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.5481481481481482,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.03894734487013316,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.03894734487013316
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6566037735849056,
"acc_stderr": 0.02922452646912479,
"acc_norm": 0.6566037735849056,
"acc_norm_stderr": 0.02922452646912479
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.03724249595817731,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.03724249595817731
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.04576665403207762,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.04576665403207762
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4978723404255319,
"acc_stderr": 0.03268572658667492,
"acc_norm": 0.4978723404255319,
"acc_norm_stderr": 0.03268572658667492
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.025107425481137285,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.025107425481137285
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7451612903225806,
"acc_stderr": 0.024790118459332208,
"acc_norm": 0.7451612903225806,
"acc_norm_stderr": 0.024790118459332208
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885417,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885417
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.025787723180723882,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.025787723180723882
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5923076923076923,
"acc_stderr": 0.02491524398598785,
"acc_norm": 0.5923076923076923,
"acc_norm_stderr": 0.02491524398598785
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114982,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114982
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566548,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566548
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.037101857261199946,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.037101857261199946
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7889908256880734,
"acc_stderr": 0.01749392240411265,
"acc_norm": 0.7889908256880734,
"acc_norm_stderr": 0.01749392240411265
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7468354430379747,
"acc_stderr": 0.02830465794303529,
"acc_norm": 0.7468354430379747,
"acc_norm_stderr": 0.02830465794303529
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.024414947304543674,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.024414947304543674
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8058748403575989,
"acc_stderr": 0.014143970276657565,
"acc_norm": 0.8058748403575989,
"acc_norm_stderr": 0.014143970276657565
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7023121387283237,
"acc_stderr": 0.024617055388677006,
"acc_norm": 0.7023121387283237,
"acc_norm_stderr": 0.024617055388677006
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.394413407821229,
"acc_stderr": 0.01634538676210397,
"acc_norm": 0.394413407821229,
"acc_norm_stderr": 0.01634538676210397
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7091503267973857,
"acc_stderr": 0.02600480036395213,
"acc_norm": 0.7091503267973857,
"acc_norm_stderr": 0.02600480036395213
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.026160584450140453,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.026160584450140453
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6820987654320988,
"acc_stderr": 0.025910063528240868,
"acc_norm": 0.6820987654320988,
"acc_norm_stderr": 0.025910063528240868
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829714,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829714
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4491525423728814,
"acc_stderr": 0.012704030518851486,
"acc_norm": 0.4491525423728814,
"acc_norm_stderr": 0.012704030518851486
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.029029422815681397,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.029029422815681397
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6437908496732027,
"acc_stderr": 0.019373332420724504,
"acc_norm": 0.6437908496732027,
"acc_norm_stderr": 0.019373332420724504
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6816326530612244,
"acc_stderr": 0.029822533793982055,
"acc_norm": 0.6816326530612244,
"acc_norm_stderr": 0.029822533793982055
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3525091799265606,
"mc1_stderr": 0.016724646380756547,
"mc2": 0.5048056232375348,
"mc2_stderr": 0.01591507188780232
},
"harness|winogrande|5": {
"acc": 0.7797947908445146,
"acc_stderr": 0.011646276755089686
},
"harness|gsm8k|5": {
"acc": 0.36694465504169826,
"acc_stderr": 0.013275883047712217
}
}
```
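As a small illustrative sketch (not part of the original card), the aggregated numbers above can also be pulled from the "results" configuration mentioned earlier. The exact column layout of that parquet is an assumption here, so inspect the loaded split before relying on specific field names:

```python
from datasets import load_dataset

# The "results" config and "latest" split are described in this card; the
# column layout of the aggregated parquet is not, so print it to inspect.
results = load_dataset("open-llm-leaderboard/details_UCLA-AGI__test0",
                       "results",
                       split="latest")
print(results)
```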
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_UCLA-AGI__test0 | [
"region:us"
] | 2024-01-05T12:02:30+00:00 | {"pretty_name": "Evaluation run of UCLA-AGI/test0", "dataset_summary": "Dataset automatically created during the evaluation run of model [UCLA-AGI/test0](https://huggingface.co/UCLA-AGI/test0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_UCLA-AGI__test0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-05T12:00:11.696869](https://huggingface.co/datasets/open-llm-leaderboard/details_UCLA-AGI__test0/blob/main/results_2024-01-05T12-00-11.696869.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.60942938330715,\n \"acc_stderr\": 0.032864296350108284,\n \"acc_norm\": 0.6144795102535505,\n \"acc_norm_stderr\": 0.03353247027217966,\n \"mc1\": 0.3525091799265606,\n \"mc1_stderr\": 0.016724646380756547,\n \"mc2\": 0.5048056232375348,\n \"mc2_stderr\": 0.01591507188780232\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6006825938566553,\n \"acc_stderr\": 0.01431209455794671,\n \"acc_norm\": 0.636518771331058,\n \"acc_norm_stderr\": 0.014056207319068283\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6543517227643896,\n \"acc_stderr\": 0.00474607219107258,\n \"acc_norm\": 0.844353714399522,\n \"acc_norm_stderr\": 0.0036177879347477483\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5481481481481482,\n \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.5481481481481482,\n \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.03894734487013316,\n \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.03894734487013316\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6566037735849056,\n \"acc_stderr\": 0.02922452646912479,\n \"acc_norm\": 0.6566037735849056,\n \"acc_norm_stderr\": 0.02922452646912479\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n 
\"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.03724249595817731,\n \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.03724249595817731\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.04576665403207762,\n \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.04576665403207762\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4978723404255319,\n \"acc_stderr\": 0.03268572658667492,\n \"acc_norm\": 0.4978723404255319,\n \"acc_norm_stderr\": 0.03268572658667492\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.025107425481137285,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.025107425481137285\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7451612903225806,\n \"acc_stderr\": 0.024790118459332208,\n \"acc_norm\": 0.7451612903225806,\n \"acc_norm_stderr\": 0.024790118459332208\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885417,\n \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885417\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.025787723180723882,\n \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.025787723180723882\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5923076923076923,\n 
\"acc_stderr\": 0.02491524398598785,\n \"acc_norm\": 0.5923076923076923,\n \"acc_norm_stderr\": 0.02491524398598785\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114982,\n \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114982\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566548,\n \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566548\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2913907284768212,\n \"acc_stderr\": 0.037101857261199946,\n \"acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.037101857261199946\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7889908256880734,\n \"acc_stderr\": 0.01749392240411265,\n \"acc_norm\": 0.7889908256880734,\n \"acc_norm_stderr\": 0.01749392240411265\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.39814814814814814,\n \"acc_stderr\": 0.033384734032074016,\n \"acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.033384734032074016\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7468354430379747,\n \"acc_stderr\": 0.02830465794303529,\n \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.02830465794303529\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.039418975265163025,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.039418975265163025\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.03487825168497892,\n \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.03487825168497892\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.024414947304543674,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.024414947304543674\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8058748403575989,\n \"acc_stderr\": 0.014143970276657565,\n \"acc_norm\": 0.8058748403575989,\n 
\"acc_norm_stderr\": 0.014143970276657565\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7023121387283237,\n \"acc_stderr\": 0.024617055388677006,\n \"acc_norm\": 0.7023121387283237,\n \"acc_norm_stderr\": 0.024617055388677006\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.394413407821229,\n \"acc_stderr\": 0.01634538676210397,\n \"acc_norm\": 0.394413407821229,\n \"acc_norm_stderr\": 0.01634538676210397\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n \"acc_stderr\": 0.026160584450140453,\n \"acc_norm\": 0.6945337620578779,\n \"acc_norm_stderr\": 0.026160584450140453\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6820987654320988,\n \"acc_stderr\": 0.025910063528240868,\n \"acc_norm\": 0.6820987654320988,\n \"acc_norm_stderr\": 0.025910063528240868\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829714,\n \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829714\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4491525423728814,\n \"acc_stderr\": 0.012704030518851486,\n \"acc_norm\": 0.4491525423728814,\n \"acc_norm_stderr\": 0.012704030518851486\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.029029422815681397,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.029029422815681397\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6437908496732027,\n \"acc_stderr\": 0.019373332420724504,\n \"acc_norm\": 0.6437908496732027,\n \"acc_norm_stderr\": 0.019373332420724504\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6816326530612244,\n \"acc_stderr\": 0.029822533793982055,\n \"acc_norm\": 0.6816326530612244,\n \"acc_norm_stderr\": 0.029822533793982055\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3525091799265606,\n \"mc1_stderr\": 0.016724646380756547,\n \"mc2\": 0.5048056232375348,\n \"mc2_stderr\": 0.01591507188780232\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7797947908445146,\n \"acc_stderr\": 0.011646276755089686\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.36694465504169826,\n \"acc_stderr\": 0.013275883047712217\n }\n}\n```", "repo_url": "https://huggingface.co/UCLA-AGI/test0", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|arc:challenge|25_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|gsm8k|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hellaswag|10_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T12-00-11.696869.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T12-00-11.696869.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T12-00-11.696869.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T12-00-11.696869.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T12-00-11.696869.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T12-00-11.696869.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["**/details_harness|winogrande|5_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-05T12-00-11.696869.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_05T12_00_11.696869", "path": ["results_2024-01-05T12-00-11.696869.parquet"]}, {"split": "latest", "path": 
["results_2024-01-05T12-00-11.696869.parquet"]}]}]} | 2024-01-05T12:02:52+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of UCLA-AGI/test0
Dataset automatically created during the evaluation run of model UCLA-AGI/test0 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
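A minimal sketch of that call for this card, using the details repository named in the metadata above (`open-llm-leaderboard/details_UCLA-AGI__test0`) and the `harness_winogrande_5` configuration as an example; any of the 63 configurations can be substituted:

```python
from datasets import load_dataset

# Per-sample details for one evaluation task of UCLA-AGI/test0.
# The configuration name selects the task; "train" always points to the latest run.
data = load_dataset(
    "open-llm-leaderboard/details_UCLA-AGI__test0",
    "harness_winogrande_5",
    split="train",
)
```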
## Latest results
These are the latest results from run 2024-01-05T12:00:11.696869 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
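A small sketch of retrieving these aggregated numbers programmatically, assuming the "results" configuration described above (its "latest" split resolves to this run):

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run;
# "latest" resolves to the most recent one (2024-01-05T12:00:11.696869 here).
results = load_dataset(
    "open-llm-leaderboard/details_UCLA-AGI__test0",
    "results",
    split="latest",
)
print(results[0])
```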
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of UCLA-AGI/test0\n\n\n\nDataset automatically created during the evaluation run of model UCLA-AGI/test0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T12:00:11.696869(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of UCLA-AGI/test0\n\n\n\nDataset automatically created during the evaluation run of model UCLA-AGI/test0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T12:00:11.696869(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
177,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of UCLA-AGI/test0\n\n\n\nDataset automatically created during the evaluation run of model UCLA-AGI/test0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-05T12:00:11.696869(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
8564bbb5ef998c3cfbbe642914b6c3c0c0bfcb00 |
# Dataset Card for Evaluation run of uukuguy/speechless-mistral-moloras-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [uukuguy/speechless-mistral-moloras-7b](https://huggingface.co/uukuguy/speechless-mistral-moloras-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-mistral-moloras-7b",
"harness_winogrande_5",
split="train")
```
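The same pattern applies to any of the 63 task configurations. As a hedged sketch (assuming this card follows the same layout as the other leaderboard detail repositories, with an aggregated `results` configuration and a `latest` split), the available configurations can be listed and the aggregated metrics loaded like this:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_uukuguy__speechless-mistral-moloras-7b"

# List the per-task configurations plus the aggregated "results" configuration.
print(get_dataset_config_names(repo))

# The "latest" split of the "results" configuration holds the most recent run's metrics.
results = load_dataset(repo, "results", split="latest")
```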
## Latest results
These are the [latest results from run 2024-01-05T12:03:44.499020](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-mistral-moloras-7b/blob/main/results_2024-01-05T12-03-44.499020.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6377826349872993,
"acc_stderr": 0.03226647554093914,
"acc_norm": 0.6437188756798331,
"acc_norm_stderr": 0.03291664382173368,
"mc1": 0.2802937576499388,
"mc1_stderr": 0.015723139524608763,
"mc2": 0.4215018483148684,
"mc2_stderr": 0.014138981180784167
},
"harness|arc:challenge|25": {
"acc": 0.568259385665529,
"acc_stderr": 0.014474591427196202,
"acc_norm": 0.5998293515358362,
"acc_norm_stderr": 0.014317197787809172
},
"harness|hellaswag|10": {
"acc": 0.6292571200955985,
"acc_stderr": 0.004820166002253078,
"acc_norm": 0.8329018123879706,
"acc_norm_stderr": 0.0037230107458783913
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316091,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316091
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.024976954053155257,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.024976954053155257
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782648,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782648
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.032250781083062896,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.032250781083062896
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386417,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386417
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8238532110091743,
"acc_stderr": 0.016332882393431385,
"acc_norm": 0.8238532110091743,
"acc_norm_stderr": 0.016332882393431385
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.033622774366080424,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.033622774366080424
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.027303484599069436,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.027303484599069436
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229146,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229146
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8173690932311622,
"acc_stderr": 0.013816335389973136,
"acc_norm": 0.8173690932311622,
"acc_norm_stderr": 0.013816335389973136
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.024476994076247333,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.024476994076247333
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3217877094972067,
"acc_stderr": 0.015624236160792579,
"acc_norm": 0.3217877094972067,
"acc_norm_stderr": 0.015624236160792579
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.026082700695399665,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.026082700695399665
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44784876140808344,
"acc_stderr": 0.01270058240476822,
"acc_norm": 0.44784876140808344,
"acc_norm_stderr": 0.01270058240476822
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.01885008469646872,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.01885008469646872
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233264,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233264
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2802937576499388,
"mc1_stderr": 0.015723139524608763,
"mc2": 0.4215018483148684,
"mc2_stderr": 0.014138981180784167
},
"harness|winogrande|5": {
"acc": 0.7837411207576953,
"acc_stderr": 0.01157061486140935
},
"harness|gsm8k|5": {
"acc": 0.37680060652009095,
"acc_stderr": 0.013347858757829158
}
}
```
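If useful, the aggregated figures above can also be retrieved programmatically. The sketch below assumes the `results` configuration and its `latest` split exist as described in this card; the exact schema of the stored row is not documented here, so treat it as exploratory.

```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics for each run; the
# "latest" split points to the most recent one (other splits are named by
# run timestamp).
results = load_dataset(
    "open-llm-leaderboard/details_uukuguy__speechless-mistral-moloras-7b",
    "results",
    split="latest",
)

# Inspect what is actually stored before relying on any particular field.
print(results.column_names)
print(results[0])
```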
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
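Pending fuller documentation, a typical direct use is loading the per-task evaluation details, following the loading example given earlier in this card. As one more illustrative sketch (the config name below follows the `harness_<task>_<n-shot>` naming used by this repository and is an assumption, not a prescription), the GSM8K details from the most recent run could be loaded as:

```python
from datasets import load_dataset

# Per-task detail configs are named like "harness_gsm8k_5"; "latest" always
# aliases the most recent evaluation run.
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_uukuguy__speechless-mistral-moloras-7b",
    "harness_gsm8k_5",
    split="latest",
)
print(len(gsm8k_details), gsm8k_details.column_names)
```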
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
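Pending a fuller description, the structure can be explored directly: each evaluated task has its own configuration, and each configuration exposes one split per run timestamp plus a `latest` alias, as described in the introduction of this card. A minimal sketch (the helper functions are from the `datasets` library; the expected configuration count is an assumption based on the card text):

```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo = "open-llm-leaderboard/details_uukuguy__speechless-mistral-moloras-7b"

# Task-level configurations plus the aggregated "results" config.
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# Timestamped splits plus the "latest" alias for one configuration.
print(get_dataset_split_names(repo, configs[0]))
```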
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
"region:us"
] | 2024-01-05T12:06:01+00:00 | {"pretty_name": "Evaluation run of uukuguy/speechless-mistral-moloras-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [uukuguy/speechless-mistral-moloras-7b](https://huggingface.co/uukuguy/speechless-mistral-moloras-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__speechless-mistral-moloras-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-05T12:03:44.499020](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-mistral-moloras-7b/blob/main/results_2024-01-05T12-03-44.499020.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6377826349872993,\n \"acc_stderr\": 0.03226647554093914,\n \"acc_norm\": 0.6437188756798331,\n \"acc_norm_stderr\": 0.03291664382173368,\n \"mc1\": 0.2802937576499388,\n \"mc1_stderr\": 0.015723139524608763,\n \"mc2\": 0.4215018483148684,\n \"mc2_stderr\": 0.014138981180784167\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.568259385665529,\n \"acc_stderr\": 0.014474591427196202,\n \"acc_norm\": 0.5998293515358362,\n \"acc_norm_stderr\": 0.014317197787809172\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6292571200955985,\n \"acc_stderr\": 0.004820166002253078,\n \"acc_norm\": 0.8329018123879706,\n \"acc_norm_stderr\": 0.0037230107458783913\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316091,\n \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316091\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n \"acc_norm_stderr\": 0.03716177437566017\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155257,\n \"acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155257\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n \"acc_stderr\": 0.023904914311782648,\n \"acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.023904914311782648\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.032250781083062896,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.032250781083062896\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386417,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386417\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n 
},\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8238532110091743,\n \"acc_stderr\": 0.016332882393431385,\n \"acc_norm\": 0.8238532110091743,\n \"acc_norm_stderr\": 0.016332882393431385\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5833333333333334,\n \"acc_stderr\": 0.033622774366080424,\n \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.033622774366080424\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7721518987341772,\n \"acc_stderr\": 0.027303484599069436,\n \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.027303484599069436\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.030769352008229146,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.030769352008229146\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159463,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159463\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.8173690932311622,\n \"acc_stderr\": 0.013816335389973136,\n \"acc_norm\": 0.8173690932311622,\n \"acc_norm_stderr\": 0.013816335389973136\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.024476994076247333,\n \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.024476994076247333\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3217877094972067,\n \"acc_stderr\": 0.015624236160792579,\n \"acc_norm\": 0.3217877094972067,\n \"acc_norm_stderr\": 0.015624236160792579\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.026082700695399665,\n \"acc_norm\": 0.6977491961414791,\n \"acc_norm_stderr\": 0.026082700695399665\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44784876140808344,\n \"acc_stderr\": 0.01270058240476822,\n \"acc_norm\": 0.44784876140808344,\n \"acc_norm_stderr\": 0.01270058240476822\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6813725490196079,\n \"acc_stderr\": 0.01885008469646872,\n \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.01885008469646872\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.026508590656233264,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.026508590656233264\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2802937576499388,\n \"mc1_stderr\": 0.015723139524608763,\n \"mc2\": 0.4215018483148684,\n \"mc2_stderr\": 0.014138981180784167\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7837411207576953,\n \"acc_stderr\": 0.01157061486140935\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.37680060652009095,\n \"acc_stderr\": 0.013347858757829158\n }\n}\n```", 
"repo_url": "https://huggingface.co/uukuguy/speechless-mistral-moloras-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|arc:challenge|25_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|gsm8k|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hellaswag|10_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T12-03-44.499020.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T12-03-44.499020.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T12-03-44.499020.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T12-03-44.499020.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T12-03-44.499020.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_05T12_03_44.499020", "path": ["**/details_harness|winogrande|5_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-05T12-03-44.499020.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_05T12_03_44.499020", "path": ["results_2024-01-05T12-03-44.499020.parquet"]}, {"split": "latest", "path": ["results_2024-01-05T12-03-44.499020.parquet"]}]}]} | 2024-01-05T12:06:24+00:00 | [] | [] | TAGS
] |
b4b2788177bcd10edc343ffd5969268e8ea99500 |
# Dataset Card for Evaluation run of princeton-nlp/Sheared-LLaMA-2.7B-ShareGPT
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [princeton-nlp/Sheared-LLaMA-2.7B-ShareGPT](https://huggingface.co/princeton-nlp/Sheared-LLaMA-2.7B-ShareGPT) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-2.7B-ShareGPT",
"harness_winogrande_5",
split="train")
```
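To pin the analysis to one specific run rather than the latest one, you can instead request the timestamp-named split (a sketch based on the split names listed in this record's configuration metadata, where underscores replace the punctuation of the run timestamp):

```python
from datasets import load_dataset

# Load the same task, but from the explicitly timestamped run split
# instead of the "train"/"latest" alias.
data = load_dataset("open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-2.7B-ShareGPT",
	"harness_winogrande_5",
	split="2024_01_05T12_10_45.462405")
```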
## Latest results
These are the [latest results from run 2024-01-05T12:10:45.462405](https://huggingface.co/datasets/open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-2.7B-ShareGPT/blob/main/results_2024-01-05T12-10-45.462405.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2919655714473898,
"acc_stderr": 0.0318639028810806,
"acc_norm": 0.2944023668702236,
"acc_norm_stderr": 0.032711391877200874,
"mc1": 0.3011015911872705,
"mc1_stderr": 0.01605899902610062,
"mc2": 0.4771392382771529,
"mc2_stderr": 0.015567072294317703
},
"harness|arc:challenge|25": {
"acc": 0.3779863481228669,
"acc_stderr": 0.014169664520303101,
"acc_norm": 0.4104095563139932,
"acc_norm_stderr": 0.014374922192642666
},
"harness|hellaswag|10": {
"acc": 0.5435172276438957,
"acc_stderr": 0.00497084669755231,
"acc_norm": 0.7126070503883688,
"acc_norm_stderr": 0.004516215206715344
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.32894736842105265,
"acc_stderr": 0.03823428969926604,
"acc_norm": 0.32894736842105265,
"acc_norm_stderr": 0.03823428969926604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.33584905660377357,
"acc_stderr": 0.02906722014664483,
"acc_norm": 0.33584905660377357,
"acc_norm_stderr": 0.02906722014664483
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.03708284662416545,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.03708284662416545
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2297872340425532,
"acc_stderr": 0.027501752944412417,
"acc_norm": 0.2297872340425532,
"acc_norm_stderr": 0.027501752944412417
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489362,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489362
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2827586206896552,
"acc_stderr": 0.03752833958003336,
"acc_norm": 0.2827586206896552,
"acc_norm_stderr": 0.03752833958003336
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.022569897074918424,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.022569897074918424
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.037184890068181146,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.037184890068181146
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2967741935483871,
"acc_stderr": 0.0259885007924119,
"acc_norm": 0.2967741935483871,
"acc_norm_stderr": 0.0259885007924119
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2315270935960591,
"acc_stderr": 0.029678333141444444,
"acc_norm": 0.2315270935960591,
"acc_norm_stderr": 0.029678333141444444
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3393939393939394,
"acc_stderr": 0.03697442205031595,
"acc_norm": 0.3393939393939394,
"acc_norm_stderr": 0.03697442205031595
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.25252525252525254,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.25252525252525254,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.37305699481865284,
"acc_stderr": 0.03490205592048574,
"acc_norm": 0.37305699481865284,
"acc_norm_stderr": 0.03490205592048574
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.29743589743589743,
"acc_stderr": 0.02317740813146593,
"acc_norm": 0.29743589743589743,
"acc_norm_stderr": 0.02317740813146593
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.02696242432507383,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.02696242432507383
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2689075630252101,
"acc_stderr": 0.028801392193631276,
"acc_norm": 0.2689075630252101,
"acc_norm_stderr": 0.028801392193631276
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24770642201834864,
"acc_stderr": 0.018508143602547808,
"acc_norm": 0.24770642201834864,
"acc_norm_stderr": 0.018508143602547808
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.30092592592592593,
"acc_stderr": 0.031280390843298825,
"acc_norm": 0.30092592592592593,
"acc_norm_stderr": 0.031280390843298825
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.032834720561085676,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.032834720561085676
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.32489451476793246,
"acc_stderr": 0.030486039389105303,
"acc_norm": 0.32489451476793246,
"acc_norm_stderr": 0.030486039389105303
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3094170403587444,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.3094170403587444,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2748091603053435,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.2748091603053435,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.39669421487603307,
"acc_stderr": 0.044658697805310094,
"acc_norm": 0.39669421487603307,
"acc_norm_stderr": 0.044658697805310094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04330043749650743,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04330043749650743
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3312883435582822,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.3312883435582822,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.04203277291467762,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.04203277291467762
},
"harness|hendrycksTest-management|5": {
"acc": 0.20388349514563106,
"acc_stderr": 0.03989139859531773,
"acc_norm": 0.20388349514563106,
"acc_norm_stderr": 0.03989139859531773
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.29914529914529914,
"acc_stderr": 0.029996951858349472,
"acc_norm": 0.29914529914529914,
"acc_norm_stderr": 0.029996951858349472
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.35887611749680715,
"acc_stderr": 0.017152991797501342,
"acc_norm": 0.35887611749680715,
"acc_norm_stderr": 0.017152991797501342
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3236994219653179,
"acc_stderr": 0.02519018132760841,
"acc_norm": 0.3236994219653179,
"acc_norm_stderr": 0.02519018132760841
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24134078212290502,
"acc_stderr": 0.014310999547961441,
"acc_norm": 0.24134078212290502,
"acc_norm_stderr": 0.014310999547961441
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.31699346405228757,
"acc_stderr": 0.02664327847450875,
"acc_norm": 0.31699346405228757,
"acc_norm_stderr": 0.02664327847450875
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3279742765273312,
"acc_stderr": 0.0266644108869376,
"acc_norm": 0.3279742765273312,
"acc_norm_stderr": 0.0266644108869376
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.32098765432098764,
"acc_stderr": 0.02597656601086274,
"acc_norm": 0.32098765432098764,
"acc_norm_stderr": 0.02597656601086274
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25177304964539005,
"acc_stderr": 0.025892151156709405,
"acc_norm": 0.25177304964539005,
"acc_norm_stderr": 0.025892151156709405
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2796610169491525,
"acc_stderr": 0.011463397393861973,
"acc_norm": 0.2796610169491525,
"acc_norm_stderr": 0.011463397393861973
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3161764705882353,
"acc_stderr": 0.028245687391462916,
"acc_norm": 0.3161764705882353,
"acc_norm_stderr": 0.028245687391462916
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2973856209150327,
"acc_stderr": 0.018492596536396955,
"acc_norm": 0.2973856209150327,
"acc_norm_stderr": 0.018492596536396955
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3551020408163265,
"acc_stderr": 0.03063565515038764,
"acc_norm": 0.3551020408163265,
"acc_norm_stderr": 0.03063565515038764
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2835820895522388,
"acc_stderr": 0.031871875379197986,
"acc_norm": 0.2835820895522388,
"acc_norm_stderr": 0.031871875379197986
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2891566265060241,
"acc_stderr": 0.03529486801511115,
"acc_norm": 0.2891566265060241,
"acc_norm_stderr": 0.03529486801511115
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3011015911872705,
"mc1_stderr": 0.01605899902610062,
"mc2": 0.4771392382771529,
"mc2_stderr": 0.015567072294317703
},
"harness|winogrande|5": {
"acc": 0.6416732438831886,
"acc_stderr": 0.013476581172567528
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-2.7B-ShareGPT | [
"region:us"
] | 2024-01-05T12:13:06+00:00 | {"pretty_name": "Evaluation run of princeton-nlp/Sheared-LLaMA-2.7B-ShareGPT", "dataset_summary": "Dataset automatically created during the evaluation run of model [princeton-nlp/Sheared-LLaMA-2.7B-ShareGPT](https://huggingface.co/princeton-nlp/Sheared-LLaMA-2.7B-ShareGPT) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-2.7B-ShareGPT\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-05T12:10:45.462405](https://huggingface.co/datasets/open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-2.7B-ShareGPT/blob/main/results_2024-01-05T12-10-45.462405.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2919655714473898,\n \"acc_stderr\": 0.0318639028810806,\n \"acc_norm\": 0.2944023668702236,\n \"acc_norm_stderr\": 0.032711391877200874,\n \"mc1\": 0.3011015911872705,\n \"mc1_stderr\": 0.01605899902610062,\n \"mc2\": 0.4771392382771529,\n \"mc2_stderr\": 0.015567072294317703\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3779863481228669,\n \"acc_stderr\": 0.014169664520303101,\n \"acc_norm\": 0.4104095563139932,\n \"acc_norm_stderr\": 0.014374922192642666\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5435172276438957,\n \"acc_stderr\": 0.00497084669755231,\n \"acc_norm\": 0.7126070503883688,\n \"acc_norm_stderr\": 0.004516215206715344\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.32894736842105265,\n \"acc_stderr\": 0.03823428969926604,\n \"acc_norm\": 0.32894736842105265,\n \"acc_norm_stderr\": 0.03823428969926604\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.33584905660377357,\n \"acc_stderr\": 0.02906722014664483,\n \"acc_norm\": 0.33584905660377357,\n \"acc_norm_stderr\": 0.02906722014664483\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 
0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24277456647398843,\n \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.24277456647398843,\n \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.16666666666666666,\n \"acc_stderr\": 0.03708284662416545,\n \"acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.03708284662416545\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2297872340425532,\n \"acc_stderr\": 0.027501752944412417,\n \"acc_norm\": 0.2297872340425532,\n \"acc_norm_stderr\": 0.027501752944412417\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.04142439719489362,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.04142439719489362\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2827586206896552,\n \"acc_stderr\": 0.03752833958003336,\n \"acc_norm\": 0.2827586206896552,\n \"acc_norm_stderr\": 0.03752833958003336\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.022569897074918424,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.022569897074918424\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.037184890068181146,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.037184890068181146\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2967741935483871,\n \"acc_stderr\": 0.0259885007924119,\n \"acc_norm\": 0.2967741935483871,\n \"acc_norm_stderr\": 0.0259885007924119\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2315270935960591,\n \"acc_stderr\": 0.029678333141444444,\n \"acc_norm\": 0.2315270935960591,\n \"acc_norm_stderr\": 0.029678333141444444\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.3393939393939394,\n \"acc_stderr\": 0.03697442205031595,\n \"acc_norm\": 0.3393939393939394,\n \"acc_norm_stderr\": 0.03697442205031595\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.25252525252525254,\n \"acc_stderr\": 0.030954055470365897,\n \"acc_norm\": 0.25252525252525254,\n \"acc_norm_stderr\": 0.030954055470365897\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.37305699481865284,\n \"acc_stderr\": 0.03490205592048574,\n \"acc_norm\": 
0.37305699481865284,\n \"acc_norm_stderr\": 0.03490205592048574\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.29743589743589743,\n \"acc_stderr\": 0.02317740813146593,\n \"acc_norm\": 0.29743589743589743,\n \"acc_norm_stderr\": 0.02317740813146593\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.02696242432507383,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.02696242432507383\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.2689075630252101,\n \"acc_stderr\": 0.028801392193631276,\n \"acc_norm\": 0.2689075630252101,\n \"acc_norm_stderr\": 0.028801392193631276\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.24770642201834864,\n \"acc_stderr\": 0.018508143602547808,\n \"acc_norm\": 0.24770642201834864,\n \"acc_norm_stderr\": 0.018508143602547808\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.30092592592592593,\n \"acc_stderr\": 0.031280390843298825,\n \"acc_norm\": 0.30092592592592593,\n \"acc_norm_stderr\": 0.031280390843298825\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.032834720561085676,\n \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.032834720561085676\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.32489451476793246,\n \"acc_stderr\": 0.030486039389105303,\n \"acc_norm\": 0.32489451476793246,\n \"acc_norm_stderr\": 0.030486039389105303\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3094170403587444,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.3094170403587444,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.039153454088478354,\n \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.039153454088478354\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.39669421487603307,\n \"acc_stderr\": 0.044658697805310094,\n \"acc_norm\": 0.39669421487603307,\n \"acc_norm_stderr\": 0.044658697805310094\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.04330043749650743,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.04330043749650743\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.3312883435582822,\n \"acc_stderr\": 0.03697983910025588,\n \"acc_norm\": 0.3312883435582822,\n \"acc_norm_stderr\": 0.03697983910025588\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n \"acc_stderr\": 0.04203277291467762,\n \"acc_norm\": 0.26785714285714285,\n \"acc_norm_stderr\": 0.04203277291467762\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.20388349514563106,\n \"acc_stderr\": 0.03989139859531773,\n \"acc_norm\": 0.20388349514563106,\n \"acc_norm_stderr\": 0.03989139859531773\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.29914529914529914,\n \"acc_stderr\": 0.029996951858349472,\n \"acc_norm\": 0.29914529914529914,\n \"acc_norm_stderr\": 0.029996951858349472\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n 
\"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.35887611749680715,\n \"acc_stderr\": 0.017152991797501342,\n \"acc_norm\": 0.35887611749680715,\n \"acc_norm_stderr\": 0.017152991797501342\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.3236994219653179,\n \"acc_stderr\": 0.02519018132760841,\n \"acc_norm\": 0.3236994219653179,\n \"acc_norm_stderr\": 0.02519018132760841\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n \"acc_stderr\": 0.014310999547961441,\n \"acc_norm\": 0.24134078212290502,\n \"acc_norm_stderr\": 0.014310999547961441\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.31699346405228757,\n \"acc_stderr\": 0.02664327847450875,\n \"acc_norm\": 0.31699346405228757,\n \"acc_norm_stderr\": 0.02664327847450875\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3279742765273312,\n \"acc_stderr\": 0.0266644108869376,\n \"acc_norm\": 0.3279742765273312,\n \"acc_norm_stderr\": 0.0266644108869376\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.32098765432098764,\n \"acc_stderr\": 0.02597656601086274,\n \"acc_norm\": 0.32098765432098764,\n \"acc_norm_stderr\": 0.02597656601086274\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.25177304964539005,\n \"acc_stderr\": 0.025892151156709405,\n \"acc_norm\": 0.25177304964539005,\n \"acc_norm_stderr\": 0.025892151156709405\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2796610169491525,\n \"acc_stderr\": 0.011463397393861973,\n \"acc_norm\": 0.2796610169491525,\n \"acc_norm_stderr\": 0.011463397393861973\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.3161764705882353,\n \"acc_stderr\": 0.028245687391462916,\n \"acc_norm\": 0.3161764705882353,\n \"acc_norm_stderr\": 0.028245687391462916\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2973856209150327,\n \"acc_stderr\": 0.018492596536396955,\n \"acc_norm\": 0.2973856209150327,\n \"acc_norm_stderr\": 0.018492596536396955\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.3551020408163265,\n \"acc_stderr\": 0.03063565515038764,\n \"acc_norm\": 0.3551020408163265,\n \"acc_norm_stderr\": 0.03063565515038764\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2835820895522388,\n \"acc_stderr\": 0.031871875379197986,\n \"acc_norm\": 0.2835820895522388,\n \"acc_norm_stderr\": 0.031871875379197986\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2891566265060241,\n \"acc_stderr\": 0.03529486801511115,\n \"acc_norm\": 0.2891566265060241,\n \"acc_norm_stderr\": 0.03529486801511115\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3011015911872705,\n \"mc1_stderr\": 0.01605899902610062,\n \"mc2\": 0.4771392382771529,\n \"mc2_stderr\": 0.015567072294317703\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6416732438831886,\n \"acc_stderr\": 
0.013476581172567528\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/princeton-nlp/Sheared-LLaMA-2.7B-ShareGPT", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|arc:challenge|25_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|gsm8k|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hellaswag|10_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T12-10-45.462405.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T12-10-45.462405.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T12-10-45.462405.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T12-10-45.462405.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T12-10-45.462405.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["**/details_harness|winogrande|5_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-05T12-10-45.462405.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_05T12_10_45.462405", "path": ["results_2024-01-05T12-10-45.462405.parquet"]}, {"split": "latest", "path": ["results_2024-01-05T12-10-45.462405.parquet"]}]}]} | 2024-01-05T12:13:30+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of princeton-nlp/Sheared-LLaMA-2.7B-ShareGPT
Dataset automatically created during the evaluation run of model princeton-nlp/Sheared-LLaMA-2.7B-ShareGPT on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
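A minimal sketch, assuming the details repository for this model follows the usual open-llm-leaderboard naming convention (`details_<org>__<model>`); adjust the repository id if it differs:

```python
from datasets import load_dataset

# Repository id assumed from the leaderboard naming convention; adjust if it differs.
data = load_dataset(
    "open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-2.7B-ShareGPT",
    "harness_winogrande_5",
    split="train",
)
```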
## Latest results
These are the latest results from run 2024-01-05T12:10:45.462405 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of princeton-nlp/Sheared-LLaMA-2.7B-ShareGPT\n\n\n\nDataset automatically created during the evaluation run of model princeton-nlp/Sheared-LLaMA-2.7B-ShareGPT on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T12:10:45.462405(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of princeton-nlp/Sheared-LLaMA-2.7B-ShareGPT\n\n\n\nDataset automatically created during the evaluation run of model princeton-nlp/Sheared-LLaMA-2.7B-ShareGPT on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T12:10:45.462405(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
203,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of princeton-nlp/Sheared-LLaMA-2.7B-ShareGPT\n\n\n\nDataset automatically created during the evaluation run of model princeton-nlp/Sheared-LLaMA-2.7B-ShareGPT on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-05T12:10:45.462405(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
3bbcc8ee14abc564bdc24c42e3e18c901f99ce57 | # Synthetic emails. Split into inquiry and answer emails
You can consume this dataset and run it in a notebook: [colab](https://colab.research.google.com/drive/1PEQyJO1-f6j0S_XJ8DV50NkpzasXkrzd?usp=sharing#scrollTo=OJXpOgBFuSrc)
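For instance, a minimal sketch of loading the dataset with the `datasets` library (the metadata below lists a single `text` field and a 1000-example `train` split):

```python
from datasets import load_dataset

# Load the synthetic film-equipment order emails; the "train" split holds 1000 examples.
emails = load_dataset("spielhoelle/film-equipment-order-emails", split="train")
print(emails[0]["text"])  # each row exposes a single "text" field
```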
You can switch the dataset there from `mlabonne/guanaco-llama2-1k` to `spielhoelle/film-equipment-order-emails` | spielhoelle/film-equipment-order-emails | [
"region:us"
] | 2024-01-05T12:58:15+00:00 | {"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1294193, "num_examples": 1000}], "download_size": 397427, "dataset_size": 1294193}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-09T23:55:51+00:00 | [] | [] | TAGS
#region-us
| # Synthetic emails. Split into inquiry and answer emails
You can consume this dataset and run it in a notebook colab
You can switch the dataset there from 'mlabonne/guanaco-llama2-1k' to 'spielhoelle/film-equipment-order-emails' | [
"# Syntetic emails. Split by inquiry and answer email\n\nConsume and run into a notebook colab\nYou can switch the dataset there from 'mlabonne/guanaco-llama2-1k' to 'spielhoelle/film-equipment-order-emails'"
] | [
"TAGS\n#region-us \n",
"# Syntetic emails. Split by inquiry and answer email\n\nConsume and run into a notebook colab\nYou can switch the dataset there from 'mlabonne/guanaco-llama2-1k' to 'spielhoelle/film-equipment-order-emails'"
] | [
6,
61
] | [
"passage: TAGS\n#region-us \n# Syntetic emails. Split by inquiry and answer email\n\nConsume and run into a notebook colab\nYou can switch the dataset there from 'mlabonne/guanaco-llama2-1k' to 'spielhoelle/film-equipment-order-emails'"
] |
1e7cac82bc53702a6edc1693f27301fb60efa01a | # Dataset Card for "myriade_noun_wsd_bis2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | gguichard/myriade_noun_wsd_bis2 | [
"region:us"
] | 2024-01-05T13:12:57+00:00 | {"dataset_info": {"features": [{"name": "tokens", "sequence": "string"}, {"name": "wn_sens", "sequence": "int64"}, {"name": "input_ids", "sequence": "int32"}, {"name": "attention_mask", "sequence": "int8"}, {"name": "labels", "sequence": "int64"}], "splits": [{"name": "train", "num_bytes": 75431443, "num_examples": 124552}], "download_size": 15044940, "dataset_size": 75431443}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-05T13:13:00+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "myriade_noun_wsd_bis2"
More Information needed | [
"# Dataset Card for \"myriade_noun_wsd_bis2\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"myriade_noun_wsd_bis2\"\n\nMore Information needed"
] | [
6,
22
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"myriade_noun_wsd_bis2\"\n\nMore Information needed"
] |
c59bee00c2d6050a2fcecb309ebfe81b38e411da |
# Dataset Card for Evaluation run of NurtureAI/MistralLite-11B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [NurtureAI/MistralLite-11B](https://huggingface.co/NurtureAI/MistralLite-11B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NurtureAI__MistralLite-11B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-05T13:36:51.362118](https://huggingface.co/datasets/open-llm-leaderboard/details_NurtureAI__MistralLite-11B/blob/main/results_2024-01-05T13-36-51.362118.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.49950728058879285,
"acc_stderr": 0.034455825054960136,
"acc_norm": 0.5071918593640932,
"acc_norm_stderr": 0.035377074137943484,
"mc1": 0.2521419828641371,
"mc1_stderr": 0.015201522246299963,
"mc2": 0.3826978228462617,
"mc2_stderr": 0.01476557964587134
},
"harness|arc:challenge|25": {
"acc": 0.5409556313993175,
"acc_stderr": 0.014562291073601227,
"acc_norm": 0.5767918088737202,
"acc_norm_stderr": 0.014438036220848029
},
"harness|hellaswag|10": {
"acc": 0.606652061342362,
"acc_stderr": 0.00487494583394707,
"acc_norm": 0.7953594901414061,
"acc_norm_stderr": 0.0040261416901312365
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5197368421052632,
"acc_stderr": 0.04065771002562605,
"acc_norm": 0.5197368421052632,
"acc_norm_stderr": 0.04065771002562605
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5622641509433962,
"acc_stderr": 0.03053333843046752,
"acc_norm": 0.5622641509433962,
"acc_norm_stderr": 0.03053333843046752
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04076663253918567,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04076663253918567
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.48554913294797686,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.48554913294797686,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4297872340425532,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.4297872340425532,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.046570472605949625,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.046570472605949625
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4413793103448276,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.4413793103448276,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.0242785680243077,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.0242785680243077
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574925,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574925
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.47419354838709676,
"acc_stderr": 0.02840609505765332,
"acc_norm": 0.47419354838709676,
"acc_norm_stderr": 0.02840609505765332
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4236453201970443,
"acc_stderr": 0.034767257476490364,
"acc_norm": 0.4236453201970443,
"acc_norm_stderr": 0.034767257476490364
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6787878787878788,
"acc_stderr": 0.036462049632538115,
"acc_norm": 0.6787878787878788,
"acc_norm_stderr": 0.036462049632538115
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5656565656565656,
"acc_stderr": 0.035315058793591834,
"acc_norm": 0.5656565656565656,
"acc_norm_stderr": 0.035315058793591834
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7564766839378239,
"acc_stderr": 0.030975436386845447,
"acc_norm": 0.7564766839378239,
"acc_norm_stderr": 0.030975436386845447
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5512820512820513,
"acc_stderr": 0.025217315184846486,
"acc_norm": 0.5512820512820513,
"acc_norm_stderr": 0.025217315184846486
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340496,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340496
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.0322529423239964,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.0322529423239964
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.03879687024073327,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.03879687024073327
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5908256880733945,
"acc_stderr": 0.02108067026443373,
"acc_norm": 0.5908256880733945,
"acc_norm_stderr": 0.02108067026443373
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.03320574612945431,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.03320574612945431
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.70042194092827,
"acc_stderr": 0.02981802474975309,
"acc_norm": 0.70042194092827,
"acc_norm_stderr": 0.02981802474975309
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5336322869955157,
"acc_stderr": 0.033481800170603065,
"acc_norm": 0.5336322869955157,
"acc_norm_stderr": 0.033481800170603065
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5877862595419847,
"acc_stderr": 0.04317171194870254,
"acc_norm": 0.5877862595419847,
"acc_norm_stderr": 0.04317171194870254
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6942148760330579,
"acc_stderr": 0.04205953933884123,
"acc_norm": 0.6942148760330579,
"acc_norm_stderr": 0.04205953933884123
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.0478034362693679,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.0478034362693679
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.50920245398773,
"acc_stderr": 0.039277056007874414,
"acc_norm": 0.50920245398773,
"acc_norm_stderr": 0.039277056007874414
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.0457237235873743,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.0457237235873743
},
"harness|hendrycksTest-management|5": {
"acc": 0.6407766990291263,
"acc_stderr": 0.04750458399041697,
"acc_norm": 0.6407766990291263,
"acc_norm_stderr": 0.04750458399041697
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5982905982905983,
"acc_stderr": 0.03211693751051622,
"acc_norm": 0.5982905982905983,
"acc_norm_stderr": 0.03211693751051622
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6053639846743295,
"acc_stderr": 0.017478464305911542,
"acc_norm": 0.6053639846743295,
"acc_norm_stderr": 0.017478464305911542
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5404624277456648,
"acc_stderr": 0.026830805998952226,
"acc_norm": 0.5404624277456648,
"acc_norm_stderr": 0.026830805998952226
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.28044692737430166,
"acc_stderr": 0.015024083883322891,
"acc_norm": 0.28044692737430166,
"acc_norm_stderr": 0.015024083883322891
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5228758169934641,
"acc_stderr": 0.028599936776089782,
"acc_norm": 0.5228758169934641,
"acc_norm_stderr": 0.028599936776089782
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5980707395498392,
"acc_stderr": 0.02784647600593047,
"acc_norm": 0.5980707395498392,
"acc_norm_stderr": 0.02784647600593047
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.027744313443376536,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.027744313443376536
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.39361702127659576,
"acc_stderr": 0.029144544781596143,
"acc_norm": 0.39361702127659576,
"acc_norm_stderr": 0.029144544781596143
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4106910039113429,
"acc_stderr": 0.012564871542534354,
"acc_norm": 0.4106910039113429,
"acc_norm_stderr": 0.012564871542534354
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5257352941176471,
"acc_stderr": 0.03033257809455504,
"acc_norm": 0.5257352941176471,
"acc_norm_stderr": 0.03033257809455504
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5212418300653595,
"acc_stderr": 0.020209572388600244,
"acc_norm": 0.5212418300653595,
"acc_norm_stderr": 0.020209572388600244
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4818181818181818,
"acc_stderr": 0.04785964010794918,
"acc_norm": 0.4818181818181818,
"acc_norm_stderr": 0.04785964010794918
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6408163265306123,
"acc_stderr": 0.030713560455108493,
"acc_norm": 0.6408163265306123,
"acc_norm_stderr": 0.030713560455108493
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.46766169154228854,
"acc_stderr": 0.03528131472933607,
"acc_norm": 0.46766169154228854,
"acc_norm_stderr": 0.03528131472933607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6198830409356725,
"acc_stderr": 0.03722965741385539,
"acc_norm": 0.6198830409356725,
"acc_norm_stderr": 0.03722965741385539
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2521419828641371,
"mc1_stderr": 0.015201522246299963,
"mc2": 0.3826978228462617,
"mc2_stderr": 0.01476557964587134
},
"harness|winogrande|5": {
"acc": 0.7663772691397001,
"acc_stderr": 0.011892194477183524
},
"harness|gsm8k|5": {
"acc": 0.0037907505686125853,
"acc_stderr": 0.0016927007401501917
}
}
```
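The aggregated numbers above can also be loaded programmatically. A minimal sketch, assuming the `results` configuration exposes the same `latest` split as the per-task configurations:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics of each run; "latest" points to the newest one.
results = load_dataset(
    "open-llm-leaderboard/details_NurtureAI__MistralLite-11B",
    "results",
    split="latest",
)
print(results[0])  # the aggregated scores of the latest run
```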
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_NurtureAI__MistralLite-11B | [
"region:us"
] | 2024-01-05T13:39:07+00:00 | {"pretty_name": "Evaluation run of NurtureAI/MistralLite-11B", "dataset_summary": "Dataset automatically created during the evaluation run of model [NurtureAI/MistralLite-11B](https://huggingface.co/NurtureAI/MistralLite-11B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NurtureAI__MistralLite-11B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-05T13:36:51.362118](https://huggingface.co/datasets/open-llm-leaderboard/details_NurtureAI__MistralLite-11B/blob/main/results_2024-01-05T13-36-51.362118.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.49950728058879285,\n \"acc_stderr\": 0.034455825054960136,\n \"acc_norm\": 0.5071918593640932,\n \"acc_norm_stderr\": 0.035377074137943484,\n \"mc1\": 0.2521419828641371,\n \"mc1_stderr\": 0.015201522246299963,\n \"mc2\": 0.3826978228462617,\n \"mc2_stderr\": 0.01476557964587134\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5409556313993175,\n \"acc_stderr\": 0.014562291073601227,\n \"acc_norm\": 0.5767918088737202,\n \"acc_norm_stderr\": 0.014438036220848029\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.606652061342362,\n \"acc_stderr\": 0.00487494583394707,\n \"acc_norm\": 0.7953594901414061,\n \"acc_norm_stderr\": 0.0040261416901312365\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5197368421052632,\n \"acc_stderr\": 0.04065771002562605,\n \"acc_norm\": 0.5197368421052632,\n \"acc_norm_stderr\": 0.04065771002562605\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5622641509433962,\n \"acc_stderr\": 0.03053333843046752,\n \"acc_norm\": 0.5622641509433962,\n \"acc_norm_stderr\": 0.03053333843046752\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.04076663253918567\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.48554913294797686,\n \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.48554913294797686,\n \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4297872340425532,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.4297872340425532,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n \"acc_stderr\": 0.046570472605949625,\n \"acc_norm\": 0.4298245614035088,\n \"acc_norm_stderr\": 0.046570472605949625\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4413793103448276,\n \"acc_stderr\": 0.04137931034482758,\n \"acc_norm\": 0.4413793103448276,\n \"acc_norm_stderr\": 0.04137931034482758\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.0242785680243077,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.0242785680243077\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n \"acc_stderr\": 0.04240799327574925,\n \"acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.04240799327574925\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.47419354838709676,\n \"acc_stderr\": 0.02840609505765332,\n \"acc_norm\": 0.47419354838709676,\n \"acc_norm_stderr\": 0.02840609505765332\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4236453201970443,\n \"acc_stderr\": 0.034767257476490364,\n \"acc_norm\": 0.4236453201970443,\n \"acc_norm_stderr\": 0.034767257476490364\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.036462049632538115,\n \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.036462049632538115\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5656565656565656,\n \"acc_stderr\": 0.035315058793591834,\n \"acc_norm\": 0.5656565656565656,\n \"acc_norm_stderr\": 0.035315058793591834\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7564766839378239,\n \"acc_stderr\": 0.030975436386845447,\n \"acc_norm\": 0.7564766839378239,\n \"acc_norm_stderr\": 0.030975436386845447\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5512820512820513,\n \"acc_stderr\": 0.025217315184846486,\n \"acc_norm\": 0.5512820512820513,\n \"acc_norm_stderr\": 0.025217315184846486\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340496,\n \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340496\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.0322529423239964,\n \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.0322529423239964\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.5908256880733945,\n \"acc_stderr\": 0.02108067026443373,\n \"acc_norm\": 0.5908256880733945,\n \"acc_norm_stderr\": 0.02108067026443373\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.03320574612945431,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.03320574612945431\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.70042194092827,\n \"acc_stderr\": 0.02981802474975309,\n \"acc_norm\": 0.70042194092827,\n \"acc_norm_stderr\": 0.02981802474975309\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5336322869955157,\n \"acc_stderr\": 0.033481800170603065,\n \"acc_norm\": 0.5336322869955157,\n \"acc_norm_stderr\": 0.033481800170603065\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5877862595419847,\n \"acc_stderr\": 0.04317171194870254,\n \"acc_norm\": 0.5877862595419847,\n \"acc_norm_stderr\": 0.04317171194870254\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6942148760330579,\n \"acc_stderr\": 0.04205953933884123,\n \"acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.04205953933884123\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.0478034362693679,\n \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.0478034362693679\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.50920245398773,\n \"acc_stderr\": 0.039277056007874414,\n \"acc_norm\": 0.50920245398773,\n \"acc_norm_stderr\": 0.039277056007874414\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6407766990291263,\n \"acc_stderr\": 0.04750458399041697,\n \"acc_norm\": 0.6407766990291263,\n \"acc_norm_stderr\": 0.04750458399041697\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5982905982905983,\n \"acc_stderr\": 0.03211693751051622,\n \"acc_norm\": 0.5982905982905983,\n \"acc_norm_stderr\": 0.03211693751051622\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6053639846743295,\n 
\"acc_stderr\": 0.017478464305911542,\n \"acc_norm\": 0.6053639846743295,\n \"acc_norm_stderr\": 0.017478464305911542\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5404624277456648,\n \"acc_stderr\": 0.026830805998952226,\n \"acc_norm\": 0.5404624277456648,\n \"acc_norm_stderr\": 0.026830805998952226\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.28044692737430166,\n \"acc_stderr\": 0.015024083883322891,\n \"acc_norm\": 0.28044692737430166,\n \"acc_norm_stderr\": 0.015024083883322891\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5228758169934641,\n \"acc_stderr\": 0.028599936776089782,\n \"acc_norm\": 0.5228758169934641,\n \"acc_norm_stderr\": 0.028599936776089782\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5980707395498392,\n \"acc_stderr\": 0.02784647600593047,\n \"acc_norm\": 0.5980707395498392,\n \"acc_norm_stderr\": 0.02784647600593047\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.027744313443376536,\n \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.027744313443376536\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.39361702127659576,\n \"acc_stderr\": 0.029144544781596143,\n \"acc_norm\": 0.39361702127659576,\n \"acc_norm_stderr\": 0.029144544781596143\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4106910039113429,\n \"acc_stderr\": 0.012564871542534354,\n \"acc_norm\": 0.4106910039113429,\n \"acc_norm_stderr\": 0.012564871542534354\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5257352941176471,\n \"acc_stderr\": 0.03033257809455504,\n \"acc_norm\": 0.5257352941176471,\n \"acc_norm_stderr\": 0.03033257809455504\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5212418300653595,\n \"acc_stderr\": 0.020209572388600244,\n \"acc_norm\": 0.5212418300653595,\n \"acc_norm_stderr\": 0.020209572388600244\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4818181818181818,\n \"acc_stderr\": 0.04785964010794918,\n \"acc_norm\": 0.4818181818181818,\n \"acc_norm_stderr\": 0.04785964010794918\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6408163265306123,\n \"acc_stderr\": 0.030713560455108493,\n \"acc_norm\": 0.6408163265306123,\n \"acc_norm_stderr\": 0.030713560455108493\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.46766169154228854,\n \"acc_stderr\": 0.03528131472933607,\n \"acc_norm\": 0.46766169154228854,\n \"acc_norm_stderr\": 0.03528131472933607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6198830409356725,\n \"acc_stderr\": 0.03722965741385539,\n \"acc_norm\": 0.6198830409356725,\n \"acc_norm_stderr\": 0.03722965741385539\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2521419828641371,\n \"mc1_stderr\": 0.015201522246299963,\n \"mc2\": 0.3826978228462617,\n \"mc2_stderr\": 0.01476557964587134\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7663772691397001,\n \"acc_stderr\": 0.011892194477183524\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0037907505686125853,\n \"acc_stderr\": 0.0016927007401501917\n 
}\n}\n```", "repo_url": "https://huggingface.co/NurtureAI/MistralLite-11B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|arc:challenge|25_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|gsm8k|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hellaswag|10_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T13-36-51.362118.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T13-36-51.362118.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T13-36-51.362118.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T13-36-51.362118.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T13-36-51.362118.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_05T13_36_51.362118", "path": ["**/details_harness|winogrande|5_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-05T13-36-51.362118.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_05T13_36_51.362118", "path": ["results_2024-01-05T13-36-51.362118.parquet"]}, {"split": "latest", "path": ["results_2024-01-05T13-36-51.362118.parquet"]}]}]} | 2024-01-05T13:39:29+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of NurtureAI/MistralLite-11B
Dataset automatically created during the evaluation run of model NurtureAI/MistralLite-11B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
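A minimal sketch, assuming the dataset repository follows the usual `details_<org>__<model>` naming convention used for these evaluation runs:

```python
from datasets import load_dataset

# Repository id assumed from the naming convention of these evaluation datasets;
# adjust it if the actual repository name differs.
data = load_dataset("open-llm-leaderboard/details_NurtureAI__MistralLite-11B",
	"harness_winogrande_5",
	split="train")
```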
## Latest results
These are the latest results from run 2024-01-05T13:36:51.362118 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of NurtureAI/MistralLite-11B\n\n\n\nDataset automatically created during the evaluation run of model NurtureAI/MistralLite-11B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T13:36:51.362118(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of NurtureAI/MistralLite-11B\n\n\n\nDataset automatically created during the evaluation run of model NurtureAI/MistralLite-11B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T13:36:51.362118(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
181,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of NurtureAI/MistralLite-11B\n\n\n\nDataset automatically created during the evaluation run of model NurtureAI/MistralLite-11B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-05T13:36:51.362118(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
220f0b2f949bad79dcb5fed627abc88542b3ba9e | # Open Assistant 2 Top English Curated
## Dataset Details
### Dataset Description
A filtered and curated dataset taken from the top-scoring https://huggingface.co/datasets/OpenAssistant/oasst2 conversations, saved in HF Chat format. The result is a high-quality dataset for SFT.
- **Created by:** [dctanner](https://huggingface.co/dctanner) and the team at [Sablo AI](https://sablo.ai)
- **License:** Apache 2.0
## Dataset Structure
We structure the dataset using the format commonly used as input into [Hugging Face Chat Templates](https://huggingface.co/docs/transformers/chat_templating):
```
[
{"role": "user", "content": "Hello, how are you?"},
{"role": "assistant", "content": "I'm doing great. How can I help you today?"}
]
```
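Such a messages list can be passed directly to a tokenizer's chat template when preparing SFT examples. The snippet below is a minimal sketch; the checkpoint name is only an illustrative assumption, not a model tied to this dataset:

```python
from transformers import AutoTokenizer

# Any chat model whose tokenizer ships a chat template will work here;
# the checkpoint below is just an example.
tokenizer = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-Instruct-v0.2")

messages = [
    {"role": "user", "content": "Hello, how are you?"},
    {"role": "assistant", "content": "I'm doing great. How can I help you today?"}
]

# Render the conversation into a single training string using the model's template.
text = tokenizer.apply_chat_template(messages, tokenize=False)
print(text)
```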
## Dataset Creation
### Source Data
- **Source Dataset:** https://huggingface.co/datasets/OpenAssistant/oasst2
#### Data Collection and Processing
We started with the top_k=1 English-only conversations from https://huggingface.co/datasets/OpenAssistant/oasst2.
Filtering and curation was done to remove conversations with the following (a rough sketch of this kind of filtering is shown after the list):
- Duplicate or very similar responses
- Responses where the AI was actually responding like a person (present in this dataset because the responses were created by humans pretending to be an AI, and not everyone followed these instructions closely)
- Profanity or inappropriate responses for an AI
- Very short response lengths (often below 50 or 200 characters)
- URLs
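The exact curation scripts are not published with this dataset, so the following is only a rough sketch of rule-based filtering applied to conversations already in the messages format shown above; the length threshold and duplicate check are illustrative assumptions, not the exact rules used:

```python
import re

URL_RE = re.compile(r"https?://\S+")
MIN_CHARS = 200  # illustrative threshold; the card mentions cutoffs around 50 or 200 characters

def keep_conversation(messages):
    """Return True if no assistant turn trips the rule-based checks listed above."""
    seen = set()
    for turn in messages:
        if turn["role"] != "assistant":
            continue
        content = turn["content"].strip()
        if len(content) < MIN_CHARS:      # very short responses
            return False
        if URL_RE.search(content):        # responses containing URLs
            return False
        if content in seen:               # exact-duplicate responses
            return False
        seen.add(content)
    return True

# conversations would be a list of message lists in the format shown above.
conversations = [
    [{"role": "user", "content": "Hello, how are you?"},
     {"role": "assistant", "content": "I'm doing great. How can I help you today?"}],
]
curated = [c for c in conversations if keep_conversation(c)]
print(f"kept {len(curated)} of {len(conversations)} conversations")
```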
# License
- **License:** Apache 2.0
This dataset is usable for commercial purposes.
# Contact
Created by [dctanner](https://huggingface.co/dctanner) and the team at [Sablo AI](https://sablo.ai) | sablo/oasst2_curated | [
"region:us"
] | 2024-01-05T13:44:07+00:00 | {"dataset_info": {"features": [{"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 9014169, "num_examples": 4693}, {"name": "test", "num_bytes": 479119, "num_examples": 247}], "download_size": 5127472, "dataset_size": 9493288}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-12T10:37:47+00:00 | [] | [] | TAGS
#region-us
| # Open Assistant 2 Top English Curated
## Dataset Details
### Dataset Description
A filtered and curated dataset taken from the top scoring URL conversations. Saved in HF Chat format. The result is a high quality dataset for SFT.
- Created by: dctanner and the team at Sablo AI
- License: Apache 2.0
## Dataset Structure
We structure the dataset using the format commonly used as input into Hugging Face Chat Templates:
## Dataset Creation
### Source Data
- Source Dataset: URL
#### Data Collection and Processing
We started with the top_k=1 English only conversations from URL
Filtering and curation was done to remove conversations with:
- Duplicate or very similar responses
- Responses where the AI was actually responding like a person (present in this dataset as the responses are created by humans pretending to be an AI, and no everyone followed these instructions closely)
- Profanity or inappropriate responses for an AI
- Very short response lengths (often below 50 or 200 characters)
- URLs
# License
- License: Apache 2.0
This dataset is usable for commercial purposes.
# Contact
Created by dctanner and the team at Sablo AI | [
"# Open Assistant 2 Top English Curated",
"## Dataset Details",
"### Dataset Description\n\nA filtered and curated dataset taken from the top scoring URL conversations. Saved in HF Chat format. The result is a high quality dataset for SFT.\n\n- Created by: dctanner and the team at Sablo AI\n- License: Apache 2.0",
"## Dataset Structure\n\nWe structure the dataset using the format commonly used as input into Hugging Face Chat Templates:",
"## Dataset Creation",
"### Source Data\n\n- Source Dataset: URL",
"#### Data Collection and Processing\n\nWe started with the top_k=1 English only conversations from URL\n\nFiltering and curation was done to remove conversations with:\n- Duplicate or very similar responses\n- Responses where the AI was actually responding like a person (present in this dataset as the responses are created by humans pretending to be an AI, and no everyone followed these instructions closely)\n- Profanity or inappropriate responses for an AI\n- Very short response lengths (often below 50 or 200 characters)\n- URLs",
"# License\n\n- License: Apache 2.0\n\nThis dataset is usable for commercial purposes.",
"# Contact\n\nCreated by dctanner and the team at Sablo AI"
] | [
"TAGS\n#region-us \n",
"# Open Assistant 2 Top English Curated",
"## Dataset Details",
"### Dataset Description\n\nA filtered and curated dataset taken from the top scoring URL conversations. Saved in HF Chat format. The result is a high quality dataset for SFT.\n\n- Created by: dctanner and the team at Sablo AI\n- License: Apache 2.0",
"## Dataset Structure\n\nWe structure the dataset using the format commonly used as input into Hugging Face Chat Templates:",
"## Dataset Creation",
"### Source Data\n\n- Source Dataset: URL",
"#### Data Collection and Processing\n\nWe started with the top_k=1 English only conversations from URL\n\nFiltering and curation was done to remove conversations with:\n- Duplicate or very similar responses\n- Responses where the AI was actually responding like a person (present in this dataset as the responses are created by humans pretending to be an AI, and no everyone followed these instructions closely)\n- Profanity or inappropriate responses for an AI\n- Very short response lengths (often below 50 or 200 characters)\n- URLs",
"# License\n\n- License: Apache 2.0\n\nThis dataset is usable for commercial purposes.",
"# Contact\n\nCreated by dctanner and the team at Sablo AI"
] | [
6,
8,
4,
64,
27,
5,
10,
117,
19,
15
] | [
"passage: TAGS\n#region-us \n# Open Assistant 2 Top English Curated## Dataset Details### Dataset Description\n\nA filtered and curated dataset taken from the top scoring URL conversations. Saved in HF Chat format. The result is a high quality dataset for SFT.\n\n- Created by: dctanner and the team at Sablo AI\n- License: Apache 2.0## Dataset Structure\n\nWe structure the dataset using the format commonly used as input into Hugging Face Chat Templates:## Dataset Creation### Source Data\n\n- Source Dataset: URL#### Data Collection and Processing\n\nWe started with the top_k=1 English only conversations from URL\n\nFiltering and curation was done to remove conversations with:\n- Duplicate or very similar responses\n- Responses where the AI was actually responding like a person (present in this dataset as the responses are created by humans pretending to be an AI, and no everyone followed these instructions closely)\n- Profanity or inappropriate responses for an AI\n- Very short response lengths (often below 50 or 200 characters)\n- URLs# License\n\n- License: Apache 2.0\n\nThis dataset is usable for commercial purposes.# Contact\n\nCreated by dctanner and the team at Sablo AI"
] |
a6e983e98a9c2e1a11a77b2d5693df6c6642e28a | # Dataset Card for "finetune-test-eiffel_dataset_20"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | awabrizwan/finetune-test-eiffel_dataset_20 | [
"region:us"
] | 2024-01-05T13:48:52+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 6166532.0, "num_examples": 20}], "download_size": 6168067, "dataset_size": 6166532.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-05T13:48:59+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "finetune-test-eiffel_dataset_20"
More Information needed | [
"# Dataset Card for \"finetune-test-eiffel_dataset_20\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"finetune-test-eiffel_dataset_20\"\n\nMore Information needed"
] | [
6,
22
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"finetune-test-eiffel_dataset_20\"\n\nMore Information needed"
] |
b9733e070d49a79c6747791eb08b2356fc51dbce | # Dataset Card for "araproje_hellaswag_tr_conf1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ibranze/araproje_hellaswag_tr_conf1 | [
"region:us"
] | 2024-01-05T13:55:01+00:00 | {"dataset_info": {"features": [{"name": "ind", "dtype": "int32"}, {"name": "activity_label", "dtype": "string"}, {"name": "ctx_a", "dtype": "string"}, {"name": "ctx_b", "dtype": "string"}, {"name": "ctx", "dtype": "string"}, {"name": "endings", "sequence": "string"}, {"name": "source_id", "dtype": "string"}, {"name": "split", "dtype": "string"}, {"name": "split_type", "dtype": "string"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "validation", "num_bytes": 162703.0, "num_examples": 250}], "download_size": 0, "dataset_size": 162703.0}, "configs": [{"config_name": "default", "data_files": [{"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-05T15:34:50+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "araproje_hellaswag_tr_conf1"
More Information needed | [
"# Dataset Card for \"araproje_hellaswag_tr_conf1\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"araproje_hellaswag_tr_conf1\"\n\nMore Information needed"
] | [
6,
22
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"araproje_hellaswag_tr_conf1\"\n\nMore Information needed"
] |
bcdfba77089974916dc31f30e24768a29322256a |
# Wikipedia Crypto Articles 🪙₿
This dataset is a collection of articles obtained from Wikipedia on January 5ᵗʰ, 2024. It contains two columns, title and article, holding the article's title as it appears on the Wikipedia website and the article's full text.
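A quick way to load and inspect these columns with the `datasets` library (a minimal sketch that assumes a single `train` split; adjust the split name if it differs):

```python
from datasets import load_dataset

# Load the dataset and peek at the two columns described above.
ds = load_dataset("luisotorres/wikipedia-crypto-articles", split="train")

print(ds.column_names)          # expected: ['title', 'article']
print(ds[0]["title"])           # title of the first article
print(ds[0]["article"][:200])   # first 200 characters of the article body
```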
The articles range from specific cryptocurrencies, such as Bitcoin or Ethereum, to historical facts, companies, exchanges, entities, and relevant people in the history of cryptocurrencies.
This dataset can be used to train machine learning models for question-answering tasks, summarization, conversation, named entity recognition, etc. | luisotorres/wikipedia-crypto-articles | [
"language:en",
"license:cc-by-sa-3.0",
"Wikipedia",
"Text",
"Article",
"finance",
"article",
"region:us"
] | 2024-01-05T13:58:56+00:00 | {"language": ["en"], "license": "cc-by-sa-3.0", "tags": ["Wikipedia", "Text", "Article", "finance", "article"]} | 2024-01-05T14:02:30+00:00 | [] | [
"en"
] | TAGS
#language-English #license-cc-by-sa-3.0 #Wikipedia #Text #Article #finance #article #region-us
|
# Wikipedia Crypto Articles ₿
This dataset is a collection of articles obtained from Wikipedia on January 5ᵗʰ, 2024. It contains two columns, title and article, containing the article's title as it is on the Wikipedia website and the article's content.
The articles vary from specific cryptocurrencies—such as Bitcoin or Ethereum—to historical facts, companies, exchanges, entities, and relevant people in the history of cryptocurrencies.
This dataset can be used to train machine learning models for question-answering tasks, summarization, conversation, named entity recognition, etc. | [
"# Wikipedia Crypto Articles ₿\n\nThis dataset is a collection of articles obtained from Wikipedia on January 5ᵗʰ, 2024. It contains two columns, title and article, containing the article's title as it is on the Wikipedia website and the article's content.\n\nThe articles vary from specific cryptocurrencies—such as Bitcoin or Ethereum—to historical facts, companies, exchanges, entities, and relevant people in the history of cryptocurrencies.\n\nThis dataset can be used to train machine learning models for question-answering tasks, summarization, conversation, named entity recognition, etc."
] | [
"TAGS\n#language-English #license-cc-by-sa-3.0 #Wikipedia #Text #Article #finance #article #region-us \n",
"# Wikipedia Crypto Articles ₿\n\nThis dataset is a collection of articles obtained from Wikipedia on January 5ᵗʰ, 2024. It contains two columns, title and article, containing the article's title as it is on the Wikipedia website and the article's content.\n\nThe articles vary from specific cryptocurrencies—such as Bitcoin or Ethereum—to historical facts, companies, exchanges, entities, and relevant people in the history of cryptocurrencies.\n\nThis dataset can be used to train machine learning models for question-answering tasks, summarization, conversation, named entity recognition, etc."
] | [
33,
135
] | [
"passage: TAGS\n#language-English #license-cc-by-sa-3.0 #Wikipedia #Text #Article #finance #article #region-us \n# Wikipedia Crypto Articles ₿\n\nThis dataset is a collection of articles obtained from Wikipedia on January 5ᵗʰ, 2024. It contains two columns, title and article, containing the article's title as it is on the Wikipedia website and the article's content.\n\nThe articles vary from specific cryptocurrencies—such as Bitcoin or Ethereum—to historical facts, companies, exchanges, entities, and relevant people in the history of cryptocurrencies.\n\nThis dataset can be used to train machine learning models for question-answering tasks, summarization, conversation, named entity recognition, etc."
] |
7d874d7428808ee5960208114b59d8d6e5844479 |
# Dataset Card for Evaluation run of nlpguy/Hermes-low-tune-2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [nlpguy/Hermes-low-tune-2](https://huggingface.co/nlpguy/Hermes-low-tune-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nlpguy__Hermes-low-tune-2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-05T13:59:33.272174](https://huggingface.co/datasets/open-llm-leaderboard/details_nlpguy__Hermes-low-tune-2/blob/main/results_2024-01-05T13-59-33.272174.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6389638892457566,
"acc_stderr": 0.03228226820237424,
"acc_norm": 0.6407807294820688,
"acc_norm_stderr": 0.03292777968100128,
"mc1": 0.3659730722154223,
"mc1_stderr": 0.016862941684088376,
"mc2": 0.5318336325194422,
"mc2_stderr": 0.01508871153008636
},
"harness|arc:challenge|25": {
"acc": 0.6075085324232082,
"acc_stderr": 0.014269634635670733,
"acc_norm": 0.6561433447098977,
"acc_norm_stderr": 0.013880644570156213
},
"harness|hellaswag|10": {
"acc": 0.6512646883091018,
"acc_stderr": 0.004755960559929163,
"acc_norm": 0.8446524596693886,
"acc_norm_stderr": 0.0036149536450656443
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800897,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800897
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.032555253593403555,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.032555253593403555
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.02328766512726855,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.02328766512726855
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5369458128078818,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.5369458128078818,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8,
"acc_stderr": 0.031234752377721164,
"acc_norm": 0.8,
"acc_norm_stderr": 0.031234752377721164
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6025641025641025,
"acc_stderr": 0.024811920017903836,
"acc_norm": 0.6025641025641025,
"acc_norm_stderr": 0.024811920017903836
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228405,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228405
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8348623853211009,
"acc_stderr": 0.015919557829976037,
"acc_norm": 0.8348623853211009,
"acc_norm_stderr": 0.015919557829976037
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639325,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639325
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233494,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233494
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7085201793721974,
"acc_stderr": 0.030500283176545847,
"acc_norm": 0.7085201793721974,
"acc_norm_stderr": 0.030500283176545847
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.031570650789119005,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.031570650789119005
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903335,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903335
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.024182427496577605,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.024182427496577605
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3106145251396648,
"acc_stderr": 0.015476515438005564,
"acc_norm": 0.3106145251396648,
"acc_norm_stderr": 0.015476515438005564
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.024848018263875192,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.024848018263875192
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153266,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153266
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.02982074719142244,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.02982074719142244
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4654498044328553,
"acc_stderr": 0.012739711554045702,
"acc_norm": 0.4654498044328553,
"acc_norm_stderr": 0.012739711554045702
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396556,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396556
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724553,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.02768691358801302,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.02768691358801302
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3659730722154223,
"mc1_stderr": 0.016862941684088376,
"mc2": 0.5318336325194422,
"mc2_stderr": 0.01508871153008636
},
"harness|winogrande|5": {
"acc": 0.7774269928966061,
"acc_stderr": 0.011690933809712666
},
"harness|gsm8k|5": {
"acc": 0.6353297952994693,
"acc_stderr": 0.013258428375662247
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
"region:us"
] | 2024-01-05T14:00:57+00:00 | {"pretty_name": "Evaluation run of nlpguy/Hermes-low-tune-2", "dataset_summary": "Dataset automatically created during the evaluation run of model [nlpguy/Hermes-low-tune-2](https://huggingface.co/nlpguy/Hermes-low-tune-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nlpguy__Hermes-low-tune-2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-05T13:59:33.272174](https://huggingface.co/datasets/open-llm-leaderboard/details_nlpguy__Hermes-low-tune-2/blob/main/results_2024-01-05T13-59-33.272174.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6389638892457566,\n \"acc_stderr\": 0.03228226820237424,\n \"acc_norm\": 0.6407807294820688,\n \"acc_norm_stderr\": 0.03292777968100128,\n \"mc1\": 0.3659730722154223,\n \"mc1_stderr\": 0.016862941684088376,\n \"mc2\": 0.5318336325194422,\n \"mc2_stderr\": 0.01508871153008636\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6075085324232082,\n \"acc_stderr\": 0.014269634635670733,\n \"acc_norm\": 0.6561433447098977,\n \"acc_norm_stderr\": 0.013880644570156213\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6512646883091018,\n \"acc_stderr\": 0.004755960559929163,\n \"acc_norm\": 0.8446524596693886,\n \"acc_norm_stderr\": 0.0036149536450656443\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800897,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800897\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n 
\"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.032555253593403555,\n \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.032555253593403555\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.02328766512726855,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.02328766512726855\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5369458128078818,\n \"acc_stderr\": 0.035083705204426656,\n \"acc_norm\": 0.5369458128078818,\n \"acc_norm_stderr\": 0.035083705204426656\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.031234752377721164,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.031234752377721164\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6025641025641025,\n 
\"acc_stderr\": 0.024811920017903836,\n \"acc_norm\": 0.6025641025641025,\n \"acc_norm_stderr\": 0.024811920017903836\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.027940457136228405,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.027940457136228405\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8348623853211009,\n \"acc_stderr\": 0.015919557829976037,\n \"acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.015919557829976037\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.027865942286639325,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639325\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233494,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233494\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n \"acc_stderr\": 0.030500283176545847,\n \"acc_norm\": 0.7085201793721974,\n \"acc_norm_stderr\": 0.030500283176545847\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.031570650789119005,\n \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.031570650789119005\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n \"acc_stderr\": 0.013586619219903335,\n \"acc_norm\": 0.8250319284802043,\n \"acc_norm_stderr\": 
0.013586619219903335\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577605,\n \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577605\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3106145251396648,\n \"acc_stderr\": 0.015476515438005564,\n \"acc_norm\": 0.3106145251396648,\n \"acc_norm_stderr\": 0.015476515438005564\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875192,\n \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875192\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n \"acc_stderr\": 0.026236965881153266,\n \"acc_norm\": 0.6913183279742765,\n \"acc_norm_stderr\": 0.026236965881153266\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5106382978723404,\n \"acc_stderr\": 0.02982074719142244,\n \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.02982074719142244\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n \"acc_stderr\": 0.012739711554045702,\n \"acc_norm\": 0.4654498044328553,\n \"acc_norm_stderr\": 0.012739711554045702\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396556,\n \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396556\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n \"acc_stderr\": 0.02768691358801302,\n \"acc_norm\": 0.8109452736318408,\n \"acc_norm_stderr\": 0.02768691358801302\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3659730722154223,\n \"mc1_stderr\": 0.016862941684088376,\n \"mc2\": 0.5318336325194422,\n \"mc2_stderr\": 0.01508871153008636\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7774269928966061,\n \"acc_stderr\": 0.011690933809712666\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6353297952994693,\n \"acc_stderr\": 0.013258428375662247\n }\n}\n```", "repo_url": "https://huggingface.co/nlpguy/Hermes-low-tune-2", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|arc:challenge|25_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|arc:challenge|25_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|gsm8k|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|gsm8k|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hellaswag|10_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hellaswag|10_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T13-58-35.823625.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T13-58-35.823625.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T13-59-33.272174.parquet", 
"**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T13-59-33.272174.parquet", 
"**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T13-59-33.272174.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T13-59-33.272174.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T13-58-35.823625.parquet"]}, 
{"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["**/details_harness|winogrande|5_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": ["**/details_harness|winogrande|5_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-05T13-59-33.272174.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_05T13_58_35.823625", "path": ["results_2024-01-05T13-58-35.823625.parquet"]}, {"split": "2024_01_05T13_59_33.272174", "path": 
["results_2024-01-05T13-59-33.272174.parquet"]}, {"split": "latest", "path": ["results_2024-01-05T13-59-33.272174.parquet"]}]}]} | 2024-01-05T14:02:13+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of nlpguy/Hermes-low-tune-2
Dataset automatically created during the evaluation run of model nlpguy/Hermes-low-tune-2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-05T13:59:33.272174 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of nlpguy/Hermes-low-tune-2\n\n\n\nDataset automatically created during the evaluation run of model nlpguy/Hermes-low-tune-2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T13:59:33.272174(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of nlpguy/Hermes-low-tune-2\n\n\n\nDataset automatically created during the evaluation run of model nlpguy/Hermes-low-tune-2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T13:59:33.272174(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
185,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of nlpguy/Hermes-low-tune-2\n\n\n\nDataset automatically created during the evaluation run of model nlpguy/Hermes-low-tune-2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-05T13:59:33.272174(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
895a83e6c55d4243f66227f0cb18daba8f6b03c6 |
# Dataset Card for Evaluation run of chargoddard/mixtralmerge-8x7B-rebalanced-test
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [chargoddard/mixtralmerge-8x7B-rebalanced-test](https://huggingface.co/chargoddard/mixtralmerge-8x7B-rebalanced-test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chargoddard__mixtralmerge-8x7B-rebalanced-test",
"harness_winogrande_5",
split="train")
```
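
The same call also works for the aggregated metrics: pointing `load_dataset` at the `results` configuration with the `latest` split returns the most recent run's summary. The snippet below is a minimal sketch of that usage; the exact columns exposed by the results parquet are not documented in this card, so it only inspects them before use.

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run ("latest" always points at it).
results = load_dataset(
    "open-llm-leaderboard/details_chargoddard__mixtralmerge-8x7B-rebalanced-test",
    "results",
    split="latest",
)

# The schema of the results parquet is not described in this card,
# so inspect the available columns before relying on any field.
print(results.column_names)
print(results[0])
```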
## Latest results
These are the [latest results from run 2024-01-05T14:01:45.060324](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__mixtralmerge-8x7B-rebalanced-test/blob/main/results_2024-01-05T14-01-45.060324.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7031585372717768,
"acc_stderr": 0.030406769723018957,
"acc_norm": 0.7069430171554243,
"acc_norm_stderr": 0.030996497825775828,
"mc1": 0.3708690330477356,
"mc1_stderr": 0.016909693580248814,
"mc2": 0.5375225566743017,
"mc2_stderr": 0.014992803866242046
},
"harness|arc:challenge|25": {
"acc": 0.6552901023890785,
"acc_stderr": 0.01388881628678211,
"acc_norm": 0.681740614334471,
"acc_norm_stderr": 0.013611993916971451
},
"harness|hellaswag|10": {
"acc": 0.6718781119298944,
"acc_stderr": 0.004685698752104804,
"acc_norm": 0.8575980880302728,
"acc_norm_stderr": 0.003487476812280519
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6888888888888889,
"acc_stderr": 0.03999262876617721,
"acc_norm": 0.6888888888888889,
"acc_norm_stderr": 0.03999262876617721
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7763157894736842,
"acc_stderr": 0.03391160934343604,
"acc_norm": 0.7763157894736842,
"acc_norm_stderr": 0.03391160934343604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7924528301886793,
"acc_stderr": 0.024959918028911267,
"acc_norm": 0.7924528301886793,
"acc_norm_stderr": 0.024959918028911267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8541666666666666,
"acc_stderr": 0.02951424596429177,
"acc_norm": 0.8541666666666666,
"acc_norm_stderr": 0.02951424596429177
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.049512182523962625,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.049512182523962625
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6468085106382979,
"acc_stderr": 0.031245325202761926,
"acc_norm": 0.6468085106382979,
"acc_norm_stderr": 0.031245325202761926
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6140350877192983,
"acc_stderr": 0.04579639422070434,
"acc_norm": 0.6140350877192983,
"acc_norm_stderr": 0.04579639422070434
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6620689655172414,
"acc_stderr": 0.039417076320648906,
"acc_norm": 0.6620689655172414,
"acc_norm_stderr": 0.039417076320648906
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5105820105820106,
"acc_stderr": 0.02574554227604548,
"acc_norm": 0.5105820105820106,
"acc_norm_stderr": 0.02574554227604548
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5793650793650794,
"acc_stderr": 0.04415438226743745,
"acc_norm": 0.5793650793650794,
"acc_norm_stderr": 0.04415438226743745
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8290322580645161,
"acc_stderr": 0.021417242936321582,
"acc_norm": 0.8290322580645161,
"acc_norm_stderr": 0.021417242936321582
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6108374384236454,
"acc_stderr": 0.034304624161038716,
"acc_norm": 0.6108374384236454,
"acc_norm_stderr": 0.034304624161038716
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.03087414513656209,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.03087414513656209
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8434343434343434,
"acc_stderr": 0.025890520358141454,
"acc_norm": 0.8434343434343434,
"acc_norm_stderr": 0.025890520358141454
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9430051813471503,
"acc_stderr": 0.016731085293607558,
"acc_norm": 0.9430051813471503,
"acc_norm_stderr": 0.016731085293607558
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7051282051282052,
"acc_stderr": 0.0231193627582323,
"acc_norm": 0.7051282051282052,
"acc_norm_stderr": 0.0231193627582323
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083025,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083025
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7899159663865546,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.7899159663865546,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4105960264900662,
"acc_stderr": 0.04016689594849928,
"acc_norm": 0.4105960264900662,
"acc_norm_stderr": 0.04016689594849928
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8733944954128441,
"acc_stderr": 0.014257128686165169,
"acc_norm": 0.8733944954128441,
"acc_norm_stderr": 0.014257128686165169
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.03350991604696043,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.03350991604696043
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8725490196078431,
"acc_stderr": 0.023405530480846315,
"acc_norm": 0.8725490196078431,
"acc_norm_stderr": 0.023405530480846315
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8734177215189873,
"acc_stderr": 0.02164419572795517,
"acc_norm": 0.8734177215189873,
"acc_norm_stderr": 0.02164419572795517
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.726457399103139,
"acc_stderr": 0.029918586707798827,
"acc_norm": 0.726457399103139,
"acc_norm_stderr": 0.029918586707798827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752599,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752599
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8677685950413223,
"acc_stderr": 0.030922788320445795,
"acc_norm": 0.8677685950413223,
"acc_norm_stderr": 0.030922788320445795
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.03520703990517963,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.03520703990517963
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286775,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286775
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5357142857142857,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.5357142857142857,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8640776699029126,
"acc_stderr": 0.03393295729761012,
"acc_norm": 0.8640776699029126,
"acc_norm_stderr": 0.03393295729761012
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9102564102564102,
"acc_stderr": 0.018724301741941642,
"acc_norm": 0.9102564102564102,
"acc_norm_stderr": 0.018724301741941642
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8812260536398467,
"acc_stderr": 0.011569134791715655,
"acc_norm": 0.8812260536398467,
"acc_norm_stderr": 0.011569134791715655
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7716763005780347,
"acc_stderr": 0.022598703804321624,
"acc_norm": 0.7716763005780347,
"acc_norm_stderr": 0.022598703804321624
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.45139664804469276,
"acc_stderr": 0.01664330737231586,
"acc_norm": 0.45139664804469276,
"acc_norm_stderr": 0.01664330737231586
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7810457516339869,
"acc_stderr": 0.02367908986180772,
"acc_norm": 0.7810457516339869,
"acc_norm_stderr": 0.02367908986180772
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.77491961414791,
"acc_stderr": 0.023720088516179027,
"acc_norm": 0.77491961414791,
"acc_norm_stderr": 0.023720088516179027
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8302469135802469,
"acc_stderr": 0.020888690414093875,
"acc_norm": 0.8302469135802469,
"acc_norm_stderr": 0.020888690414093875
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5425531914893617,
"acc_stderr": 0.02971928127223684,
"acc_norm": 0.5425531914893617,
"acc_norm_stderr": 0.02971928127223684
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5097783572359843,
"acc_stderr": 0.012767793787729341,
"acc_norm": 0.5097783572359843,
"acc_norm_stderr": 0.012767793787729341
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7757352941176471,
"acc_stderr": 0.025336848563332386,
"acc_norm": 0.7757352941176471,
"acc_norm_stderr": 0.025336848563332386
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.017160587235046345,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.017160587235046345
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8,
"acc_stderr": 0.025607375986579157,
"acc_norm": 0.8,
"acc_norm_stderr": 0.025607375986579157
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8756218905472637,
"acc_stderr": 0.023335401790166327,
"acc_norm": 0.8756218905472637,
"acc_norm_stderr": 0.023335401790166327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.02410338420207286,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.02410338420207286
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3708690330477356,
"mc1_stderr": 0.016909693580248814,
"mc2": 0.5375225566743017,
"mc2_stderr": 0.014992803866242046
},
"harness|winogrande|5": {
"acc": 0.8129439621152328,
"acc_stderr": 0.010959716435242912
},
"harness|gsm8k|5": {
"acc": 0.5822592873388931,
"acc_stderr": 0.013584820638504816
}
}
```
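
If you want to work with these numbers programmatically, the nested structure above (task name → metric → value) is straightforward to flatten. The snippet below is a minimal sketch, assuming the dict shown above has been loaded into `latest_results`; only two tasks are copied from that JSON here so the example stays self-contained.

```python
# Minimal sketch: flatten the per-task metrics shown above into (task, metric, value)
# rows. `latest_results` is assumed to be the dict printed above; only two tasks are
# included here for illustration.
latest_results = {
    "harness|arc:challenge|25": {
        "acc": 0.6552901023890785,
        "acc_norm": 0.681740614334471,
    },
    "harness|hendrycksTest-abstract_algebra|5": {
        "acc": 0.32,
        "acc_norm": 0.32,
    },
}

rows = [
    (task, metric, value)
    for task, metrics in latest_results.items()
    for metric, value in metrics.items()
]

# For example, rank the MMLU (hendrycksTest) subtasks by normalized accuracy.
mmlu_by_acc_norm = sorted(
    (value, task)
    for task, metric, value in rows
    if task.startswith("harness|hendrycksTest-") and metric == "acc_norm"
)
print(mmlu_by_acc_norm)
```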
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_chargoddard__mixtralmerge-8x7B-rebalanced-test | [
"region:us"
] | 2024-01-05T14:04:05+00:00 | {"pretty_name": "Evaluation run of chargoddard/mixtralmerge-8x7B-rebalanced-test", "dataset_summary": "Dataset automatically created during the evaluation run of model [chargoddard/mixtralmerge-8x7B-rebalanced-test](https://huggingface.co/chargoddard/mixtralmerge-8x7B-rebalanced-test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chargoddard__mixtralmerge-8x7B-rebalanced-test\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-05T14:01:45.060324](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__mixtralmerge-8x7B-rebalanced-test/blob/main/results_2024-01-05T14-01-45.060324.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7031585372717768,\n \"acc_stderr\": 0.030406769723018957,\n \"acc_norm\": 0.7069430171554243,\n \"acc_norm_stderr\": 0.030996497825775828,\n \"mc1\": 0.3708690330477356,\n \"mc1_stderr\": 0.016909693580248814,\n \"mc2\": 0.5375225566743017,\n \"mc2_stderr\": 0.014992803866242046\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6552901023890785,\n \"acc_stderr\": 0.01388881628678211,\n \"acc_norm\": 0.681740614334471,\n \"acc_norm_stderr\": 0.013611993916971451\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6718781119298944,\n \"acc_stderr\": 0.004685698752104804,\n \"acc_norm\": 0.8575980880302728,\n \"acc_norm_stderr\": 0.003487476812280519\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6888888888888889,\n \"acc_stderr\": 0.03999262876617721,\n \"acc_norm\": 0.6888888888888889,\n \"acc_norm_stderr\": 0.03999262876617721\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7763157894736842,\n \"acc_stderr\": 0.03391160934343604,\n \"acc_norm\": 0.7763157894736842,\n \"acc_norm_stderr\": 0.03391160934343604\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7924528301886793,\n \"acc_stderr\": 0.024959918028911267,\n \"acc_norm\": 0.7924528301886793,\n \"acc_norm_stderr\": 0.024959918028911267\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8541666666666666,\n \"acc_stderr\": 0.02951424596429177,\n \"acc_norm\": 0.8541666666666666,\n 
\"acc_norm_stderr\": 0.02951424596429177\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.049512182523962625,\n \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.049512182523962625\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6468085106382979,\n \"acc_stderr\": 0.031245325202761926,\n \"acc_norm\": 0.6468085106382979,\n \"acc_norm_stderr\": 0.031245325202761926\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6140350877192983,\n \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.6140350877192983,\n \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6620689655172414,\n \"acc_stderr\": 0.039417076320648906,\n \"acc_norm\": 0.6620689655172414,\n \"acc_norm_stderr\": 0.039417076320648906\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.5105820105820106,\n \"acc_stderr\": 0.02574554227604548,\n \"acc_norm\": 0.5105820105820106,\n \"acc_norm_stderr\": 0.02574554227604548\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5793650793650794,\n \"acc_stderr\": 0.04415438226743745,\n \"acc_norm\": 0.5793650793650794,\n \"acc_norm_stderr\": 0.04415438226743745\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8290322580645161,\n \"acc_stderr\": 0.021417242936321582,\n \"acc_norm\": 0.8290322580645161,\n \"acc_norm_stderr\": 0.021417242936321582\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6108374384236454,\n \"acc_stderr\": 0.034304624161038716,\n \"acc_norm\": 0.6108374384236454,\n \"acc_norm_stderr\": 0.034304624161038716\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656209,\n \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656209\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8434343434343434,\n \"acc_stderr\": 0.025890520358141454,\n \"acc_norm\": 0.8434343434343434,\n \"acc_norm_stderr\": 0.025890520358141454\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9430051813471503,\n \"acc_stderr\": 
0.016731085293607558,\n \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.016731085293607558\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7051282051282052,\n \"acc_stderr\": 0.0231193627582323,\n \"acc_norm\": 0.7051282051282052,\n \"acc_norm_stderr\": 0.0231193627582323\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083025,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083025\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7899159663865546,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.7899159663865546,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4105960264900662,\n \"acc_stderr\": 0.04016689594849928,\n \"acc_norm\": 0.4105960264900662,\n \"acc_norm_stderr\": 0.04016689594849928\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8733944954128441,\n \"acc_stderr\": 0.014257128686165169,\n \"acc_norm\": 0.8733944954128441,\n \"acc_norm_stderr\": 0.014257128686165169\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.03350991604696043,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.03350991604696043\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8725490196078431,\n \"acc_stderr\": 0.023405530480846315,\n \"acc_norm\": 0.8725490196078431,\n \"acc_norm_stderr\": 0.023405530480846315\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8734177215189873,\n \"acc_stderr\": 0.02164419572795517,\n \"acc_norm\": 0.8734177215189873,\n \"acc_norm_stderr\": 0.02164419572795517\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.726457399103139,\n \"acc_stderr\": 0.029918586707798827,\n \"acc_norm\": 0.726457399103139,\n \"acc_norm_stderr\": 0.029918586707798827\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752599,\n \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752599\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8677685950413223,\n \"acc_stderr\": 0.030922788320445795,\n \"acc_norm\": 0.8677685950413223,\n \"acc_norm_stderr\": 0.030922788320445795\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8425925925925926,\n \"acc_stderr\": 0.03520703990517963,\n \"acc_norm\": 0.8425925925925926,\n \"acc_norm_stderr\": 0.03520703990517963\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286775,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286775\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5357142857142857,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.5357142857142857,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8640776699029126,\n \"acc_stderr\": 0.03393295729761012,\n \"acc_norm\": 0.8640776699029126,\n \"acc_norm_stderr\": 0.03393295729761012\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9102564102564102,\n \"acc_stderr\": 0.018724301741941642,\n \"acc_norm\": 0.9102564102564102,\n \"acc_norm_stderr\": 0.018724301741941642\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n 
\"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8812260536398467,\n \"acc_stderr\": 0.011569134791715655,\n \"acc_norm\": 0.8812260536398467,\n \"acc_norm_stderr\": 0.011569134791715655\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7716763005780347,\n \"acc_stderr\": 0.022598703804321624,\n \"acc_norm\": 0.7716763005780347,\n \"acc_norm_stderr\": 0.022598703804321624\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.45139664804469276,\n \"acc_stderr\": 0.01664330737231586,\n \"acc_norm\": 0.45139664804469276,\n \"acc_norm_stderr\": 0.01664330737231586\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7810457516339869,\n \"acc_stderr\": 0.02367908986180772,\n \"acc_norm\": 0.7810457516339869,\n \"acc_norm_stderr\": 0.02367908986180772\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.77491961414791,\n \"acc_stderr\": 0.023720088516179027,\n \"acc_norm\": 0.77491961414791,\n \"acc_norm_stderr\": 0.023720088516179027\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8302469135802469,\n \"acc_stderr\": 0.020888690414093875,\n \"acc_norm\": 0.8302469135802469,\n \"acc_norm_stderr\": 0.020888690414093875\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5425531914893617,\n \"acc_stderr\": 0.02971928127223684,\n \"acc_norm\": 0.5425531914893617,\n \"acc_norm_stderr\": 0.02971928127223684\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5097783572359843,\n \"acc_stderr\": 0.012767793787729341,\n \"acc_norm\": 0.5097783572359843,\n \"acc_norm_stderr\": 0.012767793787729341\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7757352941176471,\n \"acc_stderr\": 0.025336848563332386,\n \"acc_norm\": 0.7757352941176471,\n \"acc_norm_stderr\": 0.025336848563332386\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.017160587235046345,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.017160587235046345\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.025607375986579157,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.025607375986579157\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8756218905472637,\n \"acc_stderr\": 0.023335401790166327,\n \"acc_norm\": 0.8756218905472637,\n \"acc_norm_stderr\": 0.023335401790166327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.02410338420207286,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.02410338420207286\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3708690330477356,\n \"mc1_stderr\": 0.016909693580248814,\n \"mc2\": 0.5375225566743017,\n \"mc2_stderr\": 0.014992803866242046\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8129439621152328,\n \"acc_stderr\": 0.010959716435242912\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.5822592873388931,\n \"acc_stderr\": 0.013584820638504816\n }\n}\n```", "repo_url": "https://huggingface.co/chargoddard/mixtralmerge-8x7B-rebalanced-test", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|arc:challenge|25_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|gsm8k|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hellaswag|10_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T14-01-45.060324.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T14-01-45.060324.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T14-01-45.060324.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T14-01-45.060324.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T14-01-45.060324.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["**/details_harness|winogrande|5_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-05T14-01-45.060324.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_05T14_01_45.060324", "path": ["results_2024-01-05T14-01-45.060324.parquet"]}, {"split": "latest", "path": ["results_2024-01-05T14-01-45.060324.parquet"]}]}]} | 2024-01-05T14:04:28+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of chargoddard/mixtralmerge-8x7B-rebalanced-test
Dataset automatically created during the evaluation run of model chargoddard/mixtralmerge-8x7B-rebalanced-test on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
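For example (a minimal sketch; the repository name below is an assumption based on the leaderboard's usual `details_<org>__<model>` naming convention):

```python
from datasets import load_dataset

# NOTE: the repository id is assumed from the standard naming convention
# used by the Open LLM Leaderboard detail datasets.
data = load_dataset(
    "open-llm-leaderboard/details_chargoddard__mixtralmerge-8x7B-rebalanced-test",
    "harness_winogrande_5",
    split="train",
)
print(data)
```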
## Latest results
These are the latest results from run 2024-01-05T14:01:45.060324 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of chargoddard/mixtralmerge-8x7B-rebalanced-test\n\n\n\nDataset automatically created during the evaluation run of model chargoddard/mixtralmerge-8x7B-rebalanced-test on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T14:01:45.060324(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of chargoddard/mixtralmerge-8x7B-rebalanced-test\n\n\n\nDataset automatically created during the evaluation run of model chargoddard/mixtralmerge-8x7B-rebalanced-test on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T14:01:45.060324(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
199,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of chargoddard/mixtralmerge-8x7B-rebalanced-test\n\n\n\nDataset automatically created during the evaluation run of model chargoddard/mixtralmerge-8x7B-rebalanced-test on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-05T14:01:45.060324(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
080f13e4eed8b293559f0b0c83217d63e3cf3eb2 |
# Dataset Card for Evaluation run of Aabbhishekk/TinyLlama-1.1B-miniguanaco
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Aabbhishekk/TinyLlama-1.1B-miniguanaco](https://huggingface.co/Aabbhishekk/TinyLlama-1.1B-miniguanaco) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Aabbhishekk__TinyLlama-1.1B-miniguanaco",
"harness_winogrande_5",
split="train")
```
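Any of the 63 configurations can be loaded the same way; a quick way to enumerate the available config names (a minimal sketch using the `datasets` helper) is:

```python
from datasets import get_dataset_config_names

# One configuration per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_Aabbhishekk__TinyLlama-1.1B-miniguanaco"
)
print(len(configs), configs[:5])
```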
## Latest results
These are the [latest results from run 2024-01-05T14:07:19.265466](https://huggingface.co/datasets/open-llm-leaderboard/details_Aabbhishekk__TinyLlama-1.1B-miniguanaco/blob/main/results_2024-01-05T14-07-19.265466.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2685394448132341,
"acc_stderr": 0.03121517593778971,
"acc_norm": 0.2699066607109775,
"acc_norm_stderr": 0.031984669596033515,
"mc1": 0.23378212974296206,
"mc1_stderr": 0.01481619599193158,
"mc2": 0.3883720794300864,
"mc2_stderr": 0.014019345761957784
},
"harness|arc:challenge|25": {
"acc": 0.3412969283276451,
"acc_stderr": 0.01385583128749772,
"acc_norm": 0.3515358361774744,
"acc_norm_stderr": 0.013952413699600938
},
"harness|hellaswag|10": {
"acc": 0.45339573790081655,
"acc_stderr": 0.004968058944472161,
"acc_norm": 0.6025692093208525,
"acc_norm_stderr": 0.004883663587184764
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.03673731683969506,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.03673731683969506
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19078947368421054,
"acc_stderr": 0.031975658210324984,
"acc_norm": 0.19078947368421054,
"acc_norm_stderr": 0.031975658210324984
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27547169811320754,
"acc_stderr": 0.027495663683724064,
"acc_norm": 0.27547169811320754,
"acc_norm_stderr": 0.027495663683724064
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2916666666666667,
"acc_stderr": 0.03800968060554857,
"acc_norm": 0.2916666666666667,
"acc_norm_stderr": 0.03800968060554857
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483099,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483099
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617749,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617749
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2765957446808511,
"acc_stderr": 0.02924188386962882,
"acc_norm": 0.2765957446808511,
"acc_norm_stderr": 0.02924188386962882
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.038351539543994194,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.038351539543994194
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2206896551724138,
"acc_stderr": 0.03455930201924811,
"acc_norm": 0.2206896551724138,
"acc_norm_stderr": 0.03455930201924811
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.022569897074918417,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.022569897074918417
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.04104947269903394,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.04104947269903394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.23225806451612904,
"acc_stderr": 0.02402225613030824,
"acc_norm": 0.23225806451612904,
"acc_norm_stderr": 0.02402225613030824
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2561576354679803,
"acc_stderr": 0.0307127300709826,
"acc_norm": 0.2561576354679803,
"acc_norm_stderr": 0.0307127300709826
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.03546563019624336,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.03546563019624336
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.029620227874790486,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.029620227874790486
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.25906735751295334,
"acc_stderr": 0.03161877917935411,
"acc_norm": 0.25906735751295334,
"acc_norm_stderr": 0.03161877917935411
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2846153846153846,
"acc_stderr": 0.022878322799706297,
"acc_norm": 0.2846153846153846,
"acc_norm_stderr": 0.022878322799706297
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.02620276653465215,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.02620276653465215
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23109243697478993,
"acc_stderr": 0.027381406927868956,
"acc_norm": 0.23109243697478993,
"acc_norm_stderr": 0.027381406927868956
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2251655629139073,
"acc_stderr": 0.03410435282008936,
"acc_norm": 0.2251655629139073,
"acc_norm_stderr": 0.03410435282008936
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23486238532110093,
"acc_stderr": 0.018175110510343578,
"acc_norm": 0.23486238532110093,
"acc_norm_stderr": 0.018175110510343578
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4398148148148148,
"acc_stderr": 0.033851779760448106,
"acc_norm": 0.4398148148148148,
"acc_norm_stderr": 0.033851779760448106
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25316455696202533,
"acc_stderr": 0.02830465794303531,
"acc_norm": 0.25316455696202533,
"acc_norm_stderr": 0.02830465794303531
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3632286995515695,
"acc_stderr": 0.03227790442850499,
"acc_norm": 0.3632286995515695,
"acc_norm_stderr": 0.03227790442850499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2366412213740458,
"acc_stderr": 0.037276735755969195,
"acc_norm": 0.2366412213740458,
"acc_norm_stderr": 0.037276735755969195
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.256198347107438,
"acc_stderr": 0.03984979653302871,
"acc_norm": 0.256198347107438,
"acc_norm_stderr": 0.03984979653302871
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25153374233128833,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.25153374233128833,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690875,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690875
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.26495726495726496,
"acc_stderr": 0.028911208802749482,
"acc_norm": 0.26495726495726496,
"acc_norm_stderr": 0.028911208802749482
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.01598281477469563,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.01598281477469563
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.022894082489925992,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.022894082489925992
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24581005586592178,
"acc_stderr": 0.014400296429225627,
"acc_norm": 0.24581005586592178,
"acc_norm_stderr": 0.014400296429225627
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24183006535947713,
"acc_stderr": 0.024518195641879337,
"acc_norm": 0.24183006535947713,
"acc_norm_stderr": 0.024518195641879337
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.29260450160771706,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.29260450160771706,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25617283950617287,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.25617283950617287,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2624113475177305,
"acc_stderr": 0.026244920349843007,
"acc_norm": 0.2624113475177305,
"acc_norm_stderr": 0.026244920349843007
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23598435462842243,
"acc_stderr": 0.010844802669662692,
"acc_norm": 0.23598435462842243,
"acc_norm_stderr": 0.010844802669662692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.029896163033125478,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.029896163033125478
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.01774089950917779,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.01774089950917779
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.1510204081632653,
"acc_stderr": 0.022923004094736868,
"acc_norm": 0.1510204081632653,
"acc_norm_stderr": 0.022923004094736868
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.02992941540834839,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.02992941540834839
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3253012048192771,
"acc_stderr": 0.03647168523683226,
"acc_norm": 0.3253012048192771,
"acc_norm_stderr": 0.03647168523683226
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.03301405946987249,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.03301405946987249
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23378212974296206,
"mc1_stderr": 0.01481619599193158,
"mc2": 0.3883720794300864,
"mc2_stderr": 0.014019345761957784
},
"harness|winogrande|5": {
"acc": 0.601420678768745,
"acc_stderr": 0.013760357176873836
},
"harness|gsm8k|5": {
"acc": 0.014404852160727824,
"acc_stderr": 0.0032820559171369557
}
}
```
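The same aggregated numbers can also be pulled programmatically from the "results" configuration, whose "latest" split always points to the most recent run; a minimal sketch:

```python
from datasets import load_dataset

# Load the aggregated metrics for the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_Aabbhishekk__TinyLlama-1.1B-miniguanaco",
    "results",
    split="latest",
)
print(results[0])  # a single row holding the aggregated scores shown above
```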
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Aabbhishekk__TinyLlama-1.1B-miniguanaco | [
"region:us"
] | 2024-01-05T14:09:09+00:00 | {"pretty_name": "Evaluation run of Aabbhishekk/TinyLlama-1.1B-miniguanaco", "dataset_summary": "Dataset automatically created during the evaluation run of model [Aabbhishekk/TinyLlama-1.1B-miniguanaco](https://huggingface.co/Aabbhishekk/TinyLlama-1.1B-miniguanaco) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Aabbhishekk__TinyLlama-1.1B-miniguanaco\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-05T14:07:19.265466](https://huggingface.co/datasets/open-llm-leaderboard/details_Aabbhishekk__TinyLlama-1.1B-miniguanaco/blob/main/results_2024-01-05T14-07-19.265466.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2685394448132341,\n \"acc_stderr\": 0.03121517593778971,\n \"acc_norm\": 0.2699066607109775,\n \"acc_norm_stderr\": 0.031984669596033515,\n \"mc1\": 0.23378212974296206,\n \"mc1_stderr\": 0.01481619599193158,\n \"mc2\": 0.3883720794300864,\n \"mc2_stderr\": 0.014019345761957784\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3412969283276451,\n \"acc_stderr\": 0.01385583128749772,\n \"acc_norm\": 0.3515358361774744,\n \"acc_norm_stderr\": 0.013952413699600938\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.45339573790081655,\n \"acc_stderr\": 0.004968058944472161,\n \"acc_norm\": 0.6025692093208525,\n \"acc_norm_stderr\": 0.004883663587184764\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.23703703703703705,\n \"acc_stderr\": 0.03673731683969506,\n \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.03673731683969506\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.19078947368421054,\n \"acc_stderr\": 0.031975658210324984,\n \"acc_norm\": 0.19078947368421054,\n \"acc_norm_stderr\": 0.031975658210324984\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.27547169811320754,\n \"acc_stderr\": 0.027495663683724064,\n \"acc_norm\": 0.27547169811320754,\n \"acc_norm_stderr\": 0.027495663683724064\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2916666666666667,\n \"acc_stderr\": 0.03800968060554857,\n \"acc_norm\": 0.2916666666666667,\n \"acc_norm_stderr\": 
0.03800968060554857\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n \"acc_stderr\": 0.03242414757483099,\n \"acc_norm\": 0.23699421965317918,\n \"acc_norm_stderr\": 0.03242414757483099\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617749,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617749\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2765957446808511,\n \"acc_stderr\": 0.02924188386962882,\n \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.02924188386962882\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.038351539543994194,\n \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.038351539543994194\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2206896551724138,\n \"acc_stderr\": 0.03455930201924811,\n \"acc_norm\": 0.2206896551724138,\n \"acc_norm_stderr\": 0.03455930201924811\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.022569897074918417,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.022569897074918417\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30158730158730157,\n \"acc_stderr\": 0.04104947269903394,\n \"acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.04104947269903394\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.23225806451612904,\n \"acc_stderr\": 0.02402225613030824,\n \"acc_norm\": 0.23225806451612904,\n \"acc_norm_stderr\": 0.02402225613030824\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2561576354679803,\n \"acc_stderr\": 0.0307127300709826,\n \"acc_norm\": 0.2561576354679803,\n \"acc_norm_stderr\": 0.0307127300709826\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2909090909090909,\n \"acc_stderr\": 0.03546563019624336,\n \"acc_norm\": 0.2909090909090909,\n \"acc_norm_stderr\": 0.03546563019624336\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.029620227874790486,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.029620227874790486\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.25906735751295334,\n \"acc_stderr\": 0.03161877917935411,\n \"acc_norm\": 
0.25906735751295334,\n \"acc_norm_stderr\": 0.03161877917935411\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2846153846153846,\n \"acc_stderr\": 0.022878322799706297,\n \"acc_norm\": 0.2846153846153846,\n \"acc_norm_stderr\": 0.022878322799706297\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24444444444444444,\n \"acc_stderr\": 0.02620276653465215,\n \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.02620276653465215\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.23109243697478993,\n \"acc_stderr\": 0.027381406927868956,\n \"acc_norm\": 0.23109243697478993,\n \"acc_norm_stderr\": 0.027381406927868956\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2251655629139073,\n \"acc_stderr\": 0.03410435282008936,\n \"acc_norm\": 0.2251655629139073,\n \"acc_norm_stderr\": 0.03410435282008936\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.23486238532110093,\n \"acc_stderr\": 0.018175110510343578,\n \"acc_norm\": 0.23486238532110093,\n \"acc_norm_stderr\": 0.018175110510343578\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4398148148148148,\n \"acc_stderr\": 0.033851779760448106,\n \"acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.033851779760448106\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604246,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604246\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.25316455696202533,\n \"acc_stderr\": 0.02830465794303531,\n \"acc_norm\": 0.25316455696202533,\n \"acc_norm_stderr\": 0.02830465794303531\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3632286995515695,\n \"acc_stderr\": 0.03227790442850499,\n \"acc_norm\": 0.3632286995515695,\n \"acc_norm_stderr\": 0.03227790442850499\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.037276735755969195,\n \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.037276735755969195\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.256198347107438,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\": 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.25153374233128833,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.25153374233128833,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690875,\n \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690875\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.26495726495726496,\n \"acc_stderr\": 0.028911208802749482,\n \"acc_norm\": 0.26495726495726496,\n \"acc_norm_stderr\": 0.028911208802749482\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 
0.04461960433384741\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.01598281477469563,\n \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.01598281477469563\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.23699421965317918,\n \"acc_stderr\": 0.022894082489925992,\n \"acc_norm\": 0.23699421965317918,\n \"acc_norm_stderr\": 0.022894082489925992\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24581005586592178,\n \"acc_stderr\": 0.014400296429225627,\n \"acc_norm\": 0.24581005586592178,\n \"acc_norm_stderr\": 0.014400296429225627\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.24183006535947713,\n \"acc_stderr\": 0.024518195641879337,\n \"acc_norm\": 0.24183006535947713,\n \"acc_norm_stderr\": 0.024518195641879337\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.29260450160771706,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.29260450160771706,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.25617283950617287,\n \"acc_stderr\": 0.0242885336377261,\n \"acc_norm\": 0.25617283950617287,\n \"acc_norm_stderr\": 0.0242885336377261\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2624113475177305,\n \"acc_stderr\": 0.026244920349843007,\n \"acc_norm\": 0.2624113475177305,\n \"acc_norm_stderr\": 0.026244920349843007\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23598435462842243,\n \"acc_stderr\": 0.010844802669662692,\n \"acc_norm\": 0.23598435462842243,\n \"acc_norm_stderr\": 0.010844802669662692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.029896163033125478,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.029896163033125478\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25980392156862747,\n \"acc_stderr\": 0.01774089950917779,\n \"acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.01774089950917779\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2909090909090909,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.2909090909090909,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.1510204081632653,\n \"acc_stderr\": 0.022923004094736868,\n \"acc_norm\": 0.1510204081632653,\n \"acc_norm_stderr\": 0.022923004094736868\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n \"acc_stderr\": 0.02992941540834839,\n \"acc_norm\": 0.23383084577114427,\n \"acc_norm_stderr\": 0.02992941540834839\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3253012048192771,\n \"acc_stderr\": 0.03647168523683226,\n \"acc_norm\": 0.3253012048192771,\n \"acc_norm_stderr\": 0.03647168523683226\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.03301405946987249,\n \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.03301405946987249\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23378212974296206,\n \"mc1_stderr\": 0.01481619599193158,\n \"mc2\": 0.3883720794300864,\n \"mc2_stderr\": 0.014019345761957784\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.601420678768745,\n \"acc_stderr\": 0.013760357176873836\n 
},\n \"harness|gsm8k|5\": {\n \"acc\": 0.014404852160727824,\n \"acc_stderr\": 0.0032820559171369557\n }\n}\n```", "repo_url": "https://huggingface.co/Aabbhishekk/TinyLlama-1.1B-miniguanaco", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|arc:challenge|25_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|gsm8k|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hellaswag|10_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T14-07-19.265466.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T14-07-19.265466.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T14-07-19.265466.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T14-07-19.265466.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T14-07-19.265466.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["**/details_harness|winogrande|5_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-05T14-07-19.265466.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_05T14_07_19.265466", "path": ["results_2024-01-05T14-07-19.265466.parquet"]}, {"split": "latest", "path": ["results_2024-01-05T14-07-19.265466.parquet"]}]}]} | 2024-01-05T14:09:31+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Aabbhishekk/TinyLlama-1.1B-miniguanaco
Dataset automatically created during the evaluation run of model Aabbhishekk/TinyLlama-1.1B-miniguanaco on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
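A minimal sketch, assuming the dataset id follows the same `open-llm-leaderboard/details_<org>__<model>` naming pattern used by the other evaluation datasets in this collection and using one of the config names listed in the metadata:

```python
from datasets import load_dataset

# Load the per-sample details of one task; the "train" split points to the latest run.
data = load_dataset("open-llm-leaderboard/details_Aabbhishekk__TinyLlama-1.1B-miniguanaco",
	"harness_winogrande_5",
	split="train")
```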
## Latest results
These are the latest results from run 2024-01-05T14:07:19.265466 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Aabbhishekk/TinyLlama-1.1B-miniguanaco\n\n\n\nDataset automatically created during the evaluation run of model Aabbhishekk/TinyLlama-1.1B-miniguanaco on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T14:07:19.265466(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Aabbhishekk/TinyLlama-1.1B-miniguanaco\n\n\n\nDataset automatically created during the evaluation run of model Aabbhishekk/TinyLlama-1.1B-miniguanaco on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T14:07:19.265466(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
197,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Aabbhishekk/TinyLlama-1.1B-miniguanaco\n\n\n\nDataset automatically created during the evaluation run of model Aabbhishekk/TinyLlama-1.1B-miniguanaco on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-05T14:07:19.265466(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
eb3e4fe9793dad0fb9d663b21cc2c5c2d8b176da |
# Dataset Card for Evaluation run of Deathsquad10/TinyLlama-1.1B-Remix-V.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Deathsquad10/TinyLlama-1.1B-Remix-V.2](https://huggingface.co/Deathsquad10/TinyLlama-1.1B-Remix-V.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Deathsquad10__TinyLlama-1.1B-Remix-V.2",
"harness_winogrande_5",
split="train")
```
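The aggregated metrics live in the additional "results" configuration described above; as a sketch, they can be loaded the same way (here using the "latest" split, which always points to the most recent run):

```python
from datasets import load_dataset

# Aggregated results of the run, as used for the Open LLM Leaderboard metrics.
results = load_dataset("open-llm-leaderboard/details_Deathsquad10__TinyLlama-1.1B-Remix-V.2",
	"results",
	split="latest")
```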
## Latest results
These are the [latest results from run 2024-01-05T14:09:04.143664](https://huggingface.co/datasets/open-llm-leaderboard/details_Deathsquad10__TinyLlama-1.1B-Remix-V.2/blob/main/results_2024-01-05T14-09-04.143664.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26429787979631164,
"acc_stderr": 0.031085584449339877,
"acc_norm": 0.2663316679650246,
"acc_norm_stderr": 0.031867914550128766,
"mc1": 0.204406364749082,
"mc1_stderr": 0.01411717433743261,
"mc2": 0.34643256618777796,
"mc2_stderr": 0.01387748248118174
},
"harness|arc:challenge|25": {
"acc": 0.2935153583617747,
"acc_stderr": 0.013307250444941118,
"acc_norm": 0.3319112627986348,
"acc_norm_stderr": 0.013760988200880536
},
"harness|hellaswag|10": {
"acc": 0.42322246564429394,
"acc_stderr": 0.0049306030615906445,
"acc_norm": 0.5662218681537542,
"acc_norm_stderr": 0.004945824056501808
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2236842105263158,
"acc_stderr": 0.03391160934343604,
"acc_norm": 0.2236842105263158,
"acc_norm_stderr": 0.03391160934343604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.26037735849056604,
"acc_stderr": 0.027008766090708094,
"acc_norm": 0.26037735849056604,
"acc_norm_stderr": 0.027008766090708094
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.1568627450980392,
"acc_stderr": 0.036186648199362466,
"acc_norm": 0.1568627450980392,
"acc_norm_stderr": 0.036186648199362466
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3148936170212766,
"acc_stderr": 0.03036358219723816,
"acc_norm": 0.3148936170212766,
"acc_norm_stderr": 0.03036358219723816
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512321984,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512321984
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.22758620689655173,
"acc_stderr": 0.03493950380131184,
"acc_norm": 0.22758620689655173,
"acc_norm_stderr": 0.03493950380131184
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.022860838309232072,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.022860838309232072
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1746031746031746,
"acc_stderr": 0.03395490020856113,
"acc_norm": 0.1746031746031746,
"acc_norm_stderr": 0.03395490020856113
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2645161290322581,
"acc_stderr": 0.02509189237885928,
"acc_norm": 0.2645161290322581,
"acc_norm_stderr": 0.02509189237885928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.28078817733990147,
"acc_stderr": 0.031618563353586114,
"acc_norm": 0.28078817733990147,
"acc_norm_stderr": 0.031618563353586114
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03453131801885416,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03453131801885416
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.19696969696969696,
"acc_stderr": 0.02833560973246335,
"acc_norm": 0.19696969696969696,
"acc_norm_stderr": 0.02833560973246335
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21761658031088082,
"acc_stderr": 0.029778663037752954,
"acc_norm": 0.21761658031088082,
"acc_norm_stderr": 0.029778663037752954
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.258974358974359,
"acc_stderr": 0.02221110681006165,
"acc_norm": 0.258974358974359,
"acc_norm_stderr": 0.02221110681006165
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.02696242432507383,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.02696242432507383
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.026265024608275886,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.026265024608275886
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23119266055045873,
"acc_stderr": 0.01807575024163315,
"acc_norm": 0.23119266055045873,
"acc_norm_stderr": 0.01807575024163315
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.22685185185185186,
"acc_stderr": 0.02856165010242224,
"acc_norm": 0.22685185185185186,
"acc_norm_stderr": 0.02856165010242224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.03019028245350194,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.03019028245350194
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2320675105485232,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.2320675105485232,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3721973094170404,
"acc_stderr": 0.032443052830087304,
"acc_norm": 0.3721973094170404,
"acc_norm_stderr": 0.032443052830087304
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2366412213740458,
"acc_stderr": 0.0372767357559692,
"acc_norm": 0.2366412213740458,
"acc_norm_stderr": 0.0372767357559692
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.3140495867768595,
"acc_stderr": 0.04236964753041018,
"acc_norm": 0.3140495867768595,
"acc_norm_stderr": 0.04236964753041018
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3128834355828221,
"acc_stderr": 0.036429145782924055,
"acc_norm": 0.3128834355828221,
"acc_norm_stderr": 0.036429145782924055
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340456,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340456
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690877,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690877
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2692307692307692,
"acc_stderr": 0.029058588303748845,
"acc_norm": 0.2692307692307692,
"acc_norm_stderr": 0.029058588303748845
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2848020434227331,
"acc_stderr": 0.01613917409652258,
"acc_norm": 0.2848020434227331,
"acc_norm_stderr": 0.01613917409652258
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.30346820809248554,
"acc_stderr": 0.02475241196091721,
"acc_norm": 0.30346820809248554,
"acc_norm_stderr": 0.02475241196091721
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.02495418432487991,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.02495418432487991
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2797427652733119,
"acc_stderr": 0.025494259350694888,
"acc_norm": 0.2797427652733119,
"acc_norm_stderr": 0.025494259350694888
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2716049382716049,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.2716049382716049,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.26595744680851063,
"acc_stderr": 0.026358065698880592,
"acc_norm": 0.26595744680851063,
"acc_norm_stderr": 0.026358065698880592
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.258148631029987,
"acc_stderr": 0.011176923719313395,
"acc_norm": 0.258148631029987,
"acc_norm_stderr": 0.011176923719313395
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.30514705882352944,
"acc_stderr": 0.027971541370170598,
"acc_norm": 0.30514705882352944,
"acc_norm_stderr": 0.027971541370170598
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.017848089574913226,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.017848089574913226
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.20408163265306123,
"acc_stderr": 0.02580128347509051,
"acc_norm": 0.20408163265306123,
"acc_norm_stderr": 0.02580128347509051
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.21890547263681592,
"acc_stderr": 0.029239174636647,
"acc_norm": 0.21890547263681592,
"acc_norm_stderr": 0.029239174636647
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3192771084337349,
"acc_stderr": 0.0362933532994786,
"acc_norm": 0.3192771084337349,
"acc_norm_stderr": 0.0362933532994786
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.25146198830409355,
"acc_stderr": 0.033275044238468436,
"acc_norm": 0.25146198830409355,
"acc_norm_stderr": 0.033275044238468436
},
"harness|truthfulqa:mc|0": {
"mc1": 0.204406364749082,
"mc1_stderr": 0.01411717433743261,
"mc2": 0.34643256618777796,
"mc2_stderr": 0.01387748248118174
},
"harness|winogrande|5": {
"acc": 0.5808997632202052,
"acc_stderr": 0.013867325192210112
},
"harness|gsm8k|5": {
"acc": 0.009097801364670205,
"acc_stderr": 0.0026153265107756716
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Deathsquad10__TinyLlama-1.1B-Remix-V.2 | [
"region:us"
] | 2024-01-05T14:10:53+00:00 | {"pretty_name": "Evaluation run of Deathsquad10/TinyLlama-1.1B-Remix-V.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [Deathsquad10/TinyLlama-1.1B-Remix-V.2](https://huggingface.co/Deathsquad10/TinyLlama-1.1B-Remix-V.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Deathsquad10__TinyLlama-1.1B-Remix-V.2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-05T14:09:04.143664](https://huggingface.co/datasets/open-llm-leaderboard/details_Deathsquad10__TinyLlama-1.1B-Remix-V.2/blob/main/results_2024-01-05T14-09-04.143664.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26429787979631164,\n \"acc_stderr\": 0.031085584449339877,\n \"acc_norm\": 0.2663316679650246,\n \"acc_norm_stderr\": 0.031867914550128766,\n \"mc1\": 0.204406364749082,\n \"mc1_stderr\": 0.01411717433743261,\n \"mc2\": 0.34643256618777796,\n \"mc2_stderr\": 0.01387748248118174\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.2935153583617747,\n \"acc_stderr\": 0.013307250444941118,\n \"acc_norm\": 0.3319112627986348,\n \"acc_norm_stderr\": 0.013760988200880536\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.42322246564429394,\n \"acc_stderr\": 0.0049306030615906445,\n \"acc_norm\": 0.5662218681537542,\n \"acc_norm_stderr\": 0.004945824056501808\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.2236842105263158,\n \"acc_stderr\": 0.03391160934343604,\n \"acc_norm\": 0.2236842105263158,\n \"acc_norm_stderr\": 0.03391160934343604\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.26037735849056604,\n \"acc_stderr\": 0.027008766090708094,\n \"acc_norm\": 0.26037735849056604,\n \"acc_norm_stderr\": 0.027008766090708094\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n \"acc_norm_stderr\": 0.03685651095897532\n 
},\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24277456647398843,\n \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.24277456647398843,\n \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.1568627450980392,\n \"acc_stderr\": 0.036186648199362466,\n \"acc_norm\": 0.1568627450980392,\n \"acc_norm_stderr\": 0.036186648199362466\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3148936170212766,\n \"acc_stderr\": 0.03036358219723816,\n \"acc_norm\": 0.3148936170212766,\n \"acc_norm_stderr\": 0.03036358219723816\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.042270544512321984,\n \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.042270544512321984\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.22758620689655173,\n \"acc_stderr\": 0.03493950380131184,\n \"acc_norm\": 0.22758620689655173,\n \"acc_norm_stderr\": 0.03493950380131184\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2698412698412698,\n \"acc_stderr\": 0.022860838309232072,\n \"acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.022860838309232072\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1746031746031746,\n \"acc_stderr\": 0.03395490020856113,\n \"acc_norm\": 0.1746031746031746,\n \"acc_norm_stderr\": 0.03395490020856113\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2645161290322581,\n \"acc_stderr\": 0.02509189237885928,\n \"acc_norm\": 0.2645161290322581,\n \"acc_norm_stderr\": 0.02509189237885928\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.28078817733990147,\n \"acc_stderr\": 0.031618563353586114,\n \"acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.031618563353586114\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.03453131801885416,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.03453131801885416\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.19696969696969696,\n \"acc_stderr\": 0.02833560973246335,\n \"acc_norm\": 0.19696969696969696,\n \"acc_norm_stderr\": 0.02833560973246335\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.21761658031088082,\n \"acc_stderr\": 0.029778663037752954,\n \"acc_norm\": 
0.21761658031088082,\n \"acc_norm_stderr\": 0.029778663037752954\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.258974358974359,\n \"acc_stderr\": 0.02221110681006165,\n \"acc_norm\": 0.258974358974359,\n \"acc_norm_stderr\": 0.02221110681006165\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.02696242432507383,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.02696242432507383\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.026265024608275886,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.026265024608275886\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.23119266055045873,\n \"acc_stderr\": 0.01807575024163315,\n \"acc_norm\": 0.23119266055045873,\n \"acc_norm_stderr\": 0.01807575024163315\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.22685185185185186,\n \"acc_stderr\": 0.02856165010242224,\n \"acc_norm\": 0.22685185185185186,\n \"acc_norm_stderr\": 0.02856165010242224\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.03019028245350194,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.03019028245350194\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2320675105485232,\n \"acc_stderr\": 0.02747974455080851,\n \"acc_norm\": 0.2320675105485232,\n \"acc_norm_stderr\": 0.02747974455080851\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3721973094170404,\n \"acc_stderr\": 0.032443052830087304,\n \"acc_norm\": 0.3721973094170404,\n \"acc_norm_stderr\": 0.032443052830087304\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.0372767357559692,\n \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.0372767357559692\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.3140495867768595,\n \"acc_stderr\": 0.04236964753041018,\n \"acc_norm\": 0.3140495867768595,\n \"acc_norm_stderr\": 0.04236964753041018\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.3128834355828221,\n \"acc_stderr\": 0.036429145782924055,\n \"acc_norm\": 0.3128834355828221,\n \"acc_norm_stderr\": 0.036429145782924055\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04287858751340456,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04287858751340456\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690877,\n \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690877\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2692307692307692,\n \"acc_stderr\": 0.029058588303748845,\n \"acc_norm\": 0.2692307692307692,\n \"acc_norm_stderr\": 0.029058588303748845\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 
0.04093601807403326\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2848020434227331,\n \"acc_stderr\": 0.01613917409652258,\n \"acc_norm\": 0.2848020434227331,\n \"acc_norm_stderr\": 0.01613917409652258\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.30346820809248554,\n \"acc_stderr\": 0.02475241196091721,\n \"acc_norm\": 0.30346820809248554,\n \"acc_norm_stderr\": 0.02475241196091721\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.02495418432487991,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.02495418432487991\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2797427652733119,\n \"acc_stderr\": 0.025494259350694888,\n \"acc_norm\": 0.2797427652733119,\n \"acc_norm_stderr\": 0.025494259350694888\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2716049382716049,\n \"acc_stderr\": 0.02474862449053737,\n \"acc_norm\": 0.2716049382716049,\n \"acc_norm_stderr\": 0.02474862449053737\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.26595744680851063,\n \"acc_stderr\": 0.026358065698880592,\n \"acc_norm\": 0.26595744680851063,\n \"acc_norm_stderr\": 0.026358065698880592\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.258148631029987,\n \"acc_stderr\": 0.011176923719313395,\n \"acc_norm\": 0.258148631029987,\n \"acc_norm_stderr\": 0.011176923719313395\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.30514705882352944,\n \"acc_stderr\": 0.027971541370170598,\n \"acc_norm\": 0.30514705882352944,\n \"acc_norm_stderr\": 0.027971541370170598\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.017848089574913226,\n \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.017848089574913226\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2909090909090909,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.2909090909090909,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.20408163265306123,\n \"acc_stderr\": 0.02580128347509051,\n \"acc_norm\": 0.20408163265306123,\n \"acc_norm_stderr\": 0.02580128347509051\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.21890547263681592,\n \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.21890547263681592,\n \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3192771084337349,\n \"acc_stderr\": 0.0362933532994786,\n \"acc_norm\": 0.3192771084337349,\n \"acc_norm_stderr\": 0.0362933532994786\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.25146198830409355,\n \"acc_stderr\": 0.033275044238468436,\n \"acc_norm\": 0.25146198830409355,\n \"acc_norm_stderr\": 0.033275044238468436\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.204406364749082,\n \"mc1_stderr\": 0.01411717433743261,\n \"mc2\": 0.34643256618777796,\n \"mc2_stderr\": 0.01387748248118174\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5808997632202052,\n \"acc_stderr\": 0.013867325192210112\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.009097801364670205,\n \"acc_stderr\": 0.0026153265107756716\n }\n}\n```", "repo_url": "https://huggingface.co/Deathsquad10/TinyLlama-1.1B-Remix-V.2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|arc:challenge|25_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|gsm8k|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hellaswag|10_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T14-09-04.143664.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T14-09-04.143664.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T14-09-04.143664.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T14-09-04.143664.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T14-09-04.143664.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["**/details_harness|winogrande|5_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-05T14-09-04.143664.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_05T14_09_04.143664", "path": ["results_2024-01-05T14-09-04.143664.parquet"]}, {"split": "latest", "path": ["results_2024-01-05T14-09-04.143664.parquet"]}]}]} | 2024-01-05T14:11:18+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Deathsquad10/TinyLlama-1.1B-Remix-V.2
Dataset automatically created during the evaluation run of model Deathsquad10/TinyLlama-1.1B-Remix-V.2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
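The snippet below follows the loader call given in the dataset summary; `harness_winogrande_5` is just one of the available configurations listed for this dataset.

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_Deathsquad10__TinyLlama-1.1B-Remix-V.2",
                    "harness_winogrande_5",
                    split="train")
```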
## Latest results
These are the latest results from run 2024-01-05T14:09:04.143664 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Deathsquad10/TinyLlama-1.1B-Remix-V.2\n\n\n\nDataset automatically created during the evaluation run of model Deathsquad10/TinyLlama-1.1B-Remix-V.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T14:09:04.143664(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Deathsquad10/TinyLlama-1.1B-Remix-V.2\n\n\n\nDataset automatically created during the evaluation run of model Deathsquad10/TinyLlama-1.1B-Remix-V.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T14:09:04.143664(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
199,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Deathsquad10/TinyLlama-1.1B-Remix-V.2\n\n\n\nDataset automatically created during the evaluation run of model Deathsquad10/TinyLlama-1.1B-Remix-V.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-05T14:09:04.143664(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
374f9e00bd21f6840c1a77783bc31c2d55e78c80 |
# Dataset Card for Midjourney Art Prompts Dataset
## Dataset Details
### Dataset Description
The Midjourney Art Prompts Dataset is a meticulously curated collection of prompts designed for training or fine-tuning Language Model Prompt Generators. Each prompt is crafted to evoke specific artistic styles, themes, and details, providing a diverse and comprehensive set for creative language model outputs.
- **Curated by:** Mohammed Shojaei
- **License:** [Apache-2.0](https://www.apache.org/licenses/LICENSE-2.0)
### Dataset Sources
- **Repository:** [Prompt-Scraper](https://github.com/mshojaei77/Prompt-Scraper)
## Uses
### Direct Use
The dataset is intended for use in training or fine-tuning Language Models, particularly those focused on generating artistic prompts. It provides a rich set of subjects and styles to enhance the creative capabilities of prompt-based language models.
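As an illustrative sketch of this use, each row can be turned into an instruction/response pair for fine-tuning a prompt generator. The pairing below relies on the `subject` and `prompt` columns described under Dataset Structure; the exact instruction template is an assumption made for illustration, not part of the dataset itself.

```python
from datasets import load_dataset

# Load the dataset from the Hub (assumes the default config exposes a "train" split).
ds = load_dataset("mshojaei77/Midjourney-Art-Prompts", split="train")

def to_pair(row):
    # Hypothetical instruction format: the short subject becomes the request,
    # and the curated Midjourney prompt becomes the target completion.
    return {
        "instruction": f"Write a detailed Midjourney-style art prompt about: {row['subject']}",
        "response": row["prompt"],
    }

# Produce instruction/response pairs ready for supervised fine-tuning.
pairs = ds.map(to_pair, remove_columns=ds.column_names)
print(pairs[0])
```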
### Out-of-Scope Use
Misuse, malicious use, and uses not suitable for generating creative prompts are considered out-of-scope for this dataset.
## Dataset Structure
The dataset is provided in CSV format with two main columns:
- **subject**: Describes the subject matter of the prompt.
- **prompt**: Provides detailed information about the artistic style, elements, and specifications for the prompt.
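A minimal loading sketch is shown below; the repository id and column names are taken from this card, while the assumption that the CSV is exposed as a single `train` split under the default configuration is not verified here.

```python
from datasets import load_dataset

# Load the prompts (assumes the CSV is served as a "train" split by the default config).
ds = load_dataset("mshojaei77/Midjourney-Art-Prompts", split="train")

# Inspect one record: a short subject plus the full, detailed prompt text.
example = ds[0]
print(example["subject"])
print(example["prompt"])
```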
## Dataset Creation
### Curation Rationale
The dataset is created to support the development and enhancement of Language Model Prompt Generators, specifically in the domain of artistic prompts.
### Source Data
#### Data Collection and Processing
The prompts are curated with a focus on diverse artistic styles and themes. Data collection involves selecting prompts that showcase varied subjects and detailed specifications.
#### Who are the source data producers?
The prompts are sourced from various artistic contexts and contributors mentioned in the dataset.
### Annotations [optional]
This dataset does not involve additional annotations beyond the provided prompts.
#### Personal and Sensitive Information
The dataset does not contain personal, sensitive, or private information.
## Bias, Risks, and Limitations
The prompts are curated to be diverse, but biases may exist in the selection process. The dataset might not cover every possible artistic style, and users should be aware of potential limitations in representation.
### Recommendations
Users should consider the dataset's limitations and supplement it with additional sources for a more comprehensive understanding of artistic prompts.
## Citation [optional]
**BibTeX:**
[Provide BibTeX Information]
**APA:**
[Provide APA Information]
## Glossary [optional]
[No specific terms or calculations in this context]
## More Information [optional]
[Any additional information you'd like to provide]
## Dataset Card Authors [optional]
[Your Name or Organization]
## Dataset Card Contact
[Your Contact Information]
| mshojaei77/Midjourney-Art-Prompts | [
"license:apache-2.0",
"region:us"
] | 2024-01-05T14:17:17+00:00 | {"license": "apache-2.0"} | 2024-01-05T15:19:07+00:00 | [] | [] | TAGS
#license-apache-2.0 #region-us
|
# Dataset Card for Midjourney Art Prompts Dataset
## Dataset Details
### Dataset Description
The Midjourney Art Prompts Dataset is a meticulously curated collection of prompts designed for training or fine-tuning Language Model Prompt Generators. Each prompt is crafted to evoke specific artistic styles, themes, and details, providing a diverse and comprehensive set for creative language model outputs.
- Curated by: Mohammed Shojaei
- License: Apache-2.0
### Dataset Sources
- Repository: [URL
## Uses
### Direct Use
The dataset is intended for use in training or fine-tuning Language Models, particularly those focused on generating artistic prompts. It provides a rich set of subjects and styles to enhance the creative capabilities of prompt-based language models.
### Out-of-Scope Use
Misuse, malicious use, and uses not suitable for generating creative prompts are considered out-of-scope for this dataset.
## Dataset Structure
The dataset is provided in CSV format with two main columns:
- subject: Describes the subject matter of the prompt.
- prompt: Provides detailed information about the artistic style, elements, and specifications for the prompt.
## Dataset Creation
### Curation Rationale
The dataset is created to support the development and enhancement of Language Model Prompt Generators, specifically in the domain of artistic prompts.
### Source Data
#### Data Collection and Processing
The prompts are curated with a focus on diverse artistic styles and themes. Data collection involves selecting prompts that showcase varied subjects and detailed specifications.
#### Who are the source data producers?
The prompts are sourced from various artistic contexts and contributors mentioned in the dataset.
### Annotations [optional]
This dataset does not involve additional annotations beyond the provided prompts.
#### Personal and Sensitive Information
The dataset does not contain personal, sensitive, or private information.
## Bias, Risks, and Limitations
The prompts are curated to be diverse, but biases may exist in the selection process. The dataset might not cover every possible artistic style, and users should be aware of potential limitations in representation.
### Recommendations
Users should consider the dataset's limitations and supplement it with additional sources for a more comprehensive understanding of artistic prompts.
[optional]
BibTeX:
[Provide BibTeX Information]
APA:
[Provide APA Information]
## Glossary [optional]
[No specific terms or calculations in this context]
## More Information [optional]
[Any additional information you'd like to provide]
## Dataset Card Authors [optional]
[Your Name or Organization]
## Dataset Card Contact
[Your Contact Information]
| [
"# Dataset Card for Midjourney Art Prompts Dataset",
"## Dataset Details",
"### Dataset Description\n\nThe Midjourney Art Prompts Dataset is a meticulously curated collection of prompts designed for training or fine-tuning Language Model Prompt Generators. Each prompt is crafted to evoke specific artistic styles, themes, and details, providing a diverse and comprehensive set for creative language model outputs.\n\n- Curated by: Mohammed Shojaei \n- License: Apache-2.0",
"### Dataset Sources\n\n- Repository: [URL",
"## Uses",
"### Direct Use\n\nThe dataset is intended for use in training or fine-tuning Language Models, particularly those focused on generating artistic prompts. It provides a rich set of subjects and styles to enhance the creative capabilities of prompt-based language models.",
"### Out-of-Scope Use\n\nMisuse, malicious use, and uses not suitable for generating creative prompts are considered out-of-scope for this dataset.",
"## Dataset Structure\n\nThe dataset is provided in CSV format with two main columns:\n\n- subject: Describes the subject matter of the prompt.\n- prompt: Provides detailed information about the artistic style, elements, and specifications for the prompt.",
"## Dataset Creation",
"### Curation Rationale\n\nThe dataset is created to support the development and enhancement of Language Model Prompt Generators, specifically in the domain of artistic prompts.",
"### Source Data",
"#### Data Collection and Processing\n\nThe prompts are curated with a focus on diverse artistic styles and themes. Data collection involves selecting prompts that showcase varied subjects and detailed specifications.",
"#### Who are the source data producers?\n\nThe prompts are sourced from various artistic contexts and contributors mentioned in the dataset.",
"### Annotations [optional]\n\nThis dataset does not involve additional annotations beyond the provided prompts.",
"#### Personal and Sensitive Information\n\nThe dataset does not contain personal, sensitive, or private information.",
"## Bias, Risks, and Limitations\n\nThe prompts are curated to be diverse, but biases may exist in the selection process. The dataset might not cover every possible artistic style, and users should be aware of potential limitations in representation.",
"### Recommendations\n\nUsers should consider the dataset's limitations and supplement it with additional sources for a more comprehensive understanding of artistic prompts.\n\n[optional]\n\nBibTeX:\n\n[Provide BibTeX Information]\n\nAPA:\n\n[Provide APA Information]",
"## Glossary [optional]\n\n[No specific terms or calculations in this context]",
"## More Information [optional]\n\n[Any additional information you'd like to provide]",
"## Dataset Card Authors [optional]\n\n[Your Name or Organization]",
"## Dataset Card Contact\n\n[Your Contact Information]"
] | [
"TAGS\n#license-apache-2.0 #region-us \n",
"# Dataset Card for Midjourney Art Prompts Dataset",
"## Dataset Details",
"### Dataset Description\n\nThe Midjourney Art Prompts Dataset is a meticulously curated collection of prompts designed for training or fine-tuning Language Model Prompt Generators. Each prompt is crafted to evoke specific artistic styles, themes, and details, providing a diverse and comprehensive set for creative language model outputs.\n\n- Curated by: Mohammed Shojaei \n- License: Apache-2.0",
"### Dataset Sources\n\n- Repository: [URL",
"## Uses",
"### Direct Use\n\nThe dataset is intended for use in training or fine-tuning Language Models, particularly those focused on generating artistic prompts. It provides a rich set of subjects and styles to enhance the creative capabilities of prompt-based language models.",
"### Out-of-Scope Use\n\nMisuse, malicious use, and uses not suitable for generating creative prompts are considered out-of-scope for this dataset.",
"## Dataset Structure\n\nThe dataset is provided in CSV format with two main columns:\n\n- subject: Describes the subject matter of the prompt.\n- prompt: Provides detailed information about the artistic style, elements, and specifications for the prompt.",
"## Dataset Creation",
"### Curation Rationale\n\nThe dataset is created to support the development and enhancement of Language Model Prompt Generators, specifically in the domain of artistic prompts.",
"### Source Data",
"#### Data Collection and Processing\n\nThe prompts are curated with a focus on diverse artistic styles and themes. Data collection involves selecting prompts that showcase varied subjects and detailed specifications.",
"#### Who are the source data producers?\n\nThe prompts are sourced from various artistic contexts and contributors mentioned in the dataset.",
"### Annotations [optional]\n\nThis dataset does not involve additional annotations beyond the provided prompts.",
"#### Personal and Sensitive Information\n\nThe dataset does not contain personal, sensitive, or private information.",
"## Bias, Risks, and Limitations\n\nThe prompts are curated to be diverse, but biases may exist in the selection process. The dataset might not cover every possible artistic style, and users should be aware of potential limitations in representation.",
"### Recommendations\n\nUsers should consider the dataset's limitations and supplement it with additional sources for a more comprehensive understanding of artistic prompts.\n\n[optional]\n\nBibTeX:\n\n[Provide BibTeX Information]\n\nAPA:\n\n[Provide APA Information]",
"## Glossary [optional]\n\n[No specific terms or calculations in this context]",
"## More Information [optional]\n\n[Any additional information you'd like to provide]",
"## Dataset Card Authors [optional]\n\n[Your Name or Organization]",
"## Dataset Card Contact\n\n[Your Contact Information]"
] | [
14,
14,
4,
90,
13,
3,
56,
39,
56,
5,
37,
4,
45,
30,
25,
22,
55,
59,
19,
19,
17,
11
] | [
"passage: TAGS\n#license-apache-2.0 #region-us \n# Dataset Card for Midjourney Art Prompts Dataset## Dataset Details### Dataset Description\n\nThe Midjourney Art Prompts Dataset is a meticulously curated collection of prompts designed for training or fine-tuning Language Model Prompt Generators. Each prompt is crafted to evoke specific artistic styles, themes, and details, providing a diverse and comprehensive set for creative language model outputs.\n\n- Curated by: Mohammed Shojaei \n- License: Apache-2.0### Dataset Sources\n\n- Repository: [URL## Uses### Direct Use\n\nThe dataset is intended for use in training or fine-tuning Language Models, particularly those focused on generating artistic prompts. It provides a rich set of subjects and styles to enhance the creative capabilities of prompt-based language models.### Out-of-Scope Use\n\nMisuse, malicious use, and uses not suitable for generating creative prompts are considered out-of-scope for this dataset.## Dataset Structure\n\nThe dataset is provided in CSV format with two main columns:\n\n- subject: Describes the subject matter of the prompt.\n- prompt: Provides detailed information about the artistic style, elements, and specifications for the prompt.## Dataset Creation### Curation Rationale\n\nThe dataset is created to support the development and enhancement of Language Model Prompt Generators, specifically in the domain of artistic prompts.### Source Data#### Data Collection and Processing\n\nThe prompts are curated with a focus on diverse artistic styles and themes. Data collection involves selecting prompts that showcase varied subjects and detailed specifications.#### Who are the source data producers?\n\nThe prompts are sourced from various artistic contexts and contributors mentioned in the dataset.### Annotations [optional]\n\nThis dataset does not involve additional annotations beyond the provided prompts.#### Personal and Sensitive Information\n\nThe dataset does not contain personal, sensitive, or private information."
] |
d4e7d5e764aa5d5fc02b400e49ccfe062b5777ad | # Dataset Card for "araproje_hellaswag_tr_conf2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ibranze/araproje_hellaswag_tr_conf2 | [
"region:us"
] | 2024-01-05T14:31:23+00:00 | {"dataset_info": {"features": [{"name": "ind", "dtype": "int32"}, {"name": "activity_label", "dtype": "string"}, {"name": "ctx_a", "dtype": "string"}, {"name": "ctx_b", "dtype": "string"}, {"name": "ctx", "dtype": "string"}, {"name": "endings", "sequence": "string"}, {"name": "source_id", "dtype": "string"}, {"name": "split", "dtype": "string"}, {"name": "split_type", "dtype": "string"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "validation", "num_bytes": 162703.0, "num_examples": 250}], "download_size": 86220, "dataset_size": 162703.0}, "configs": [{"config_name": "default", "data_files": [{"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-05T15:34:55+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "araproje_hellaswag_tr_conf2"
More Information needed | [
"# Dataset Card for \"araproje_hellaswag_tr_conf2\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"araproje_hellaswag_tr_conf2\"\n\nMore Information needed"
] | [
6,
22
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"araproje_hellaswag_tr_conf2\"\n\nMore Information needed"
] |
019f6245734f2a11cb0f2183cb5d0cc495dfb0e0 |
# Dataset Card for Evaluation run of Weyaxi/MetaMath-Tulpar-7b-v2-Slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Weyaxi/MetaMath-Tulpar-7b-v2-Slerp](https://huggingface.co/Weyaxi/MetaMath-Tulpar-7b-v2-Slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__MetaMath-Tulpar-7b-v2-Slerp",
"harness_winogrande_5",
split="train")
```
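The aggregated "results" configuration mentioned above can be loaded the same way; a minimal sketch follows (the exact record layout is not guaranteed and may vary between harness versions):

```python
from datasets import load_dataset

# The "train" split of the "results" configuration points to the latest run.
results = load_dataset("open-llm-leaderboard/details_Weyaxi__MetaMath-Tulpar-7b-v2-Slerp",
                       "results",
                       split="train")
print(results[0])
```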
## Latest results
These are the [latest results from run 2024-01-05T14:51:30.669474](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__MetaMath-Tulpar-7b-v2-Slerp/blob/main/results_2024-01-05T14-51-30.669474.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6379672317121475,
"acc_stderr": 0.032268482670470874,
"acc_norm": 0.6378521186236827,
"acc_norm_stderr": 0.0329317188618121,
"mc1": 0.4039167686658507,
"mc1_stderr": 0.017177276822584284,
"mc2": 0.5644227760342831,
"mc2_stderr": 0.015511434380507188
},
"harness|arc:challenge|25": {
"acc": 0.6305460750853242,
"acc_stderr": 0.01410457836649189,
"acc_norm": 0.659556313993174,
"acc_norm_stderr": 0.013847460518892978
},
"harness|hellaswag|10": {
"acc": 0.6676956781517626,
"acc_stderr": 0.004700767741735563,
"acc_norm": 0.8511252738498307,
"acc_norm_stderr": 0.0035523745313052004
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544067,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.025331202438944433,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.025331202438944433
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723292,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723292
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479048,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479048
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659355,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6358974358974359,
"acc_stderr": 0.02439667298509476,
"acc_norm": 0.6358974358974359,
"acc_norm_stderr": 0.02439667298509476
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.02889774874113115,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.02889774874113115
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.015848255806501534,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.015848255806501534
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588667,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588667
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909456,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909456
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03826076324884866,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03826076324884866
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.01358661921990333,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.01358661921990333
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41899441340782123,
"acc_stderr": 0.016501579306861674,
"acc_norm": 0.41899441340782123,
"acc_norm_stderr": 0.016501579306861674
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757485,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757485
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218893,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.02979071924382972,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.02979071924382972
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4602346805736636,
"acc_stderr": 0.012729785386598557,
"acc_norm": 0.4602346805736636,
"acc_norm_stderr": 0.012729785386598557
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6580882352941176,
"acc_stderr": 0.028814722422254184,
"acc_norm": 0.6580882352941176,
"acc_norm_stderr": 0.028814722422254184
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.019094228167000328,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.019094228167000328
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169143,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169143
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4039167686658507,
"mc1_stderr": 0.017177276822584284,
"mc2": 0.5644227760342831,
"mc2_stderr": 0.015511434380507188
},
"harness|winogrande|5": {
"acc": 0.7908445146014207,
"acc_stderr": 0.011430450045881578
},
"harness|gsm8k|5": {
"acc": 0.7043214556482184,
"acc_stderr": 0.012570068947898767
}
}
```
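As a small, hedged example of working with the dictionary above, the sketch below assumes it has been saved locally as `results.json` (the filename is an assumption, not something produced by the evaluation run itself):

```python
import json

# Load the results dictionary shown above from a local copy.
with open("results.json") as f:
    results = json.load(f)

# Mean accuracy over the MMLU (hendrycksTest) subtasks.
mmlu = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
print(f"{len(mmlu)} MMLU subtasks, mean acc = {sum(mmlu) / len(mmlu):.4f}")

# Headline aggregate reported under the "all" key.
print(f"overall acc = {results['all']['acc']:.4f}")
```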
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Weyaxi__MetaMath-Tulpar-7b-v2-Slerp | [
"region:us"
] | 2024-01-05T14:53:49+00:00 | {"pretty_name": "Evaluation run of Weyaxi/MetaMath-Tulpar-7b-v2-Slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [Weyaxi/MetaMath-Tulpar-7b-v2-Slerp](https://huggingface.co/Weyaxi/MetaMath-Tulpar-7b-v2-Slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__MetaMath-Tulpar-7b-v2-Slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-05T14:51:30.669474](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__MetaMath-Tulpar-7b-v2-Slerp/blob/main/results_2024-01-05T14-51-30.669474.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6379672317121475,\n \"acc_stderr\": 0.032268482670470874,\n \"acc_norm\": 0.6378521186236827,\n \"acc_norm_stderr\": 0.0329317188618121,\n \"mc1\": 0.4039167686658507,\n \"mc1_stderr\": 0.017177276822584284,\n \"mc2\": 0.5644227760342831,\n \"mc2_stderr\": 0.015511434380507188\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6305460750853242,\n \"acc_stderr\": 0.01410457836649189,\n \"acc_norm\": 0.659556313993174,\n \"acc_norm_stderr\": 0.013847460518892978\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6676956781517626,\n \"acc_stderr\": 0.004700767741735563,\n \"acc_norm\": 0.8511252738498307,\n \"acc_norm_stderr\": 0.0035523745313052004\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544067,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544067\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n \"acc_norm_stderr\": 0.03827052357950756\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944433,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944433\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723292,\n \"acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723292\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479048,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479048\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659355,\n \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 
0.02381447708659355\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6358974358974359,\n \"acc_stderr\": 0.02439667298509476,\n \"acc_norm\": 0.6358974358974359,\n \"acc_norm_stderr\": 0.02439667298509476\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.02889774874113115,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.02889774874113115\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8366972477064221,\n \"acc_stderr\": 0.015848255806501534,\n \"acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.015848255806501534\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588667,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588667\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909456,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909456\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.03826076324884866,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.03826076324884866\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n \"acc_stderr\": 0.01358661921990333,\n \"acc_norm\": 0.8250319284802043,\n \"acc_norm_stderr\": 0.01358661921990333\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41899441340782123,\n \"acc_stderr\": 0.016501579306861674,\n \"acc_norm\": 0.41899441340782123,\n \"acc_norm_stderr\": 0.016501579306861674\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757485,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757485\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053737,\n \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053737\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4602346805736636,\n \"acc_stderr\": 0.012729785386598557,\n \"acc_norm\": 0.4602346805736636,\n \"acc_norm_stderr\": 0.012729785386598557\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6580882352941176,\n \"acc_stderr\": 0.028814722422254184,\n \"acc_norm\": 0.6580882352941176,\n \"acc_norm_stderr\": 0.028814722422254184\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6650326797385621,\n \"acc_stderr\": 0.019094228167000328,\n \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.019094228167000328\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169143,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169143\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4039167686658507,\n \"mc1_stderr\": 0.017177276822584284,\n \"mc2\": 0.5644227760342831,\n \"mc2_stderr\": 0.015511434380507188\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7908445146014207,\n \"acc_stderr\": 0.011430450045881578\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.7043214556482184,\n \"acc_stderr\": 0.012570068947898767\n }\n}\n```", "repo_url": "https://huggingface.co/Weyaxi/MetaMath-Tulpar-7b-v2-Slerp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|arc:challenge|25_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|gsm8k|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hellaswag|10_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T14-51-30.669474.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T14-51-30.669474.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T14-51-30.669474.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T14-51-30.669474.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T14-51-30.669474.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["**/details_harness|winogrande|5_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-05T14-51-30.669474.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_05T14_51_30.669474", "path": ["results_2024-01-05T14-51-30.669474.parquet"]}, {"split": "latest", "path": ["results_2024-01-05T14-51-30.669474.parquet"]}]}]} | 2024-01-05T14:54:14+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Weyaxi/MetaMath-Tulpar-7b-v2-Slerp
Dataset automatically created during the evaluation run of model Weyaxi/MetaMath-Tulpar-7b-v2-Slerp on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
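A minimal sketch, assuming the details repository follows the leaderboard's standard `details_<org>__<model>` naming (here `open-llm-leaderboard/details_Weyaxi__MetaMath-Tulpar-7b-v2-Slerp`) and exposes the configurations and splits listed in the metadata above:

```python
from datasets import load_dataset

# Hypothetical repository name, assuming the standard Open LLM Leaderboard naming scheme.
# The "latest" split is defined explicitly in the configuration metadata above.
data = load_dataset("open-llm-leaderboard/details_Weyaxi__MetaMath-Tulpar-7b-v2-Slerp",
	"harness_winogrande_5",
	split="latest")
```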
## Latest results
These are the latest results from run 2024-01-05T14:51:30.669474 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
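The aggregated numbers themselves can be loaded from the "results" configuration declared in the metadata above; a minimal sketch, assuming the same repository name as in the snippet above:

```python
from datasets import load_dataset

# The "results" configuration stores one aggregated record per run;
# the "latest" split aliases the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_Weyaxi__MetaMath-Tulpar-7b-v2-Slerp",
	"results",
	split="latest")
print(results.column_names)  # inspect which aggregated fields are available
```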
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Weyaxi/MetaMath-Tulpar-7b-v2-Slerp\n\n\n\nDataset automatically created during the evaluation run of model Weyaxi/MetaMath-Tulpar-7b-v2-Slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T14:51:30.669474(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Weyaxi/MetaMath-Tulpar-7b-v2-Slerp\n\n\n\nDataset automatically created during the evaluation run of model Weyaxi/MetaMath-Tulpar-7b-v2-Slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T14:51:30.669474(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
199,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Weyaxi/MetaMath-Tulpar-7b-v2-Slerp\n\n\n\nDataset automatically created during the evaluation run of model Weyaxi/MetaMath-Tulpar-7b-v2-Slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-05T14:51:30.669474(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
cdca697b0911bf6799ab92df44800de214cf1eb2 | # Dataset Card for "asr-dummy"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | bofenghuang/asr-dummy | [
"region:us"
] | 2024-01-05T14:58:33+00:00 | {"dataset_info": {"config_name": "fr", "features": [{"name": "audio", "dtype": {"audio": {"sampling_rate": 16000}}}, {"name": "text", "dtype": "string"}, {"name": "duration", "dtype": "float64"}, {"name": "split", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 35664300.82401466, "num_examples": 120}], "download_size": 0, "dataset_size": 35664300.82401466}, "configs": [{"config_name": "fr", "data_files": [{"split": "test", "path": "fr/test-*"}]}]} | 2024-01-05T15:00:12+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "asr-dummy"
More Information needed | [
"# Dataset Card for \"asr-dummy\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"asr-dummy\"\n\nMore Information needed"
] | [
6,
15
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"asr-dummy\"\n\nMore Information needed"
] |
6a2d25012c1b6755a9e32ab185b16f8f95fa2a5c | # Dataset Card for "pairwise_classification_synthetic_gpt4_10k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Xapien/pairwise_classification_synthetic_gpt4_10k | [
"region:us"
] | 2024-01-05T15:04:55+00:00 | {"dataset_info": {"features": [{"name": "name", "dtype": "string"}, {"name": "summary_a", "dtype": "string"}, {"name": "same_entity_summary", "dtype": "string"}, {"name": "different_entity_summary", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3650249, "num_examples": 9755}], "download_size": 1695269, "dataset_size": 3650249}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-05T15:04:57+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "pairwise_classification_synthetic_gpt4_10k"
More Information needed | [
"# Dataset Card for \"pairwise_classification_synthetic_gpt4_10k\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"pairwise_classification_synthetic_gpt4_10k\"\n\nMore Information needed"
] | [
6,
26
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"pairwise_classification_synthetic_gpt4_10k\"\n\nMore Information needed"
] |
755a74bebdbb3d04c4e2d6db1c4f5b4f0e367e61 |
# Dataset Card for Evaluation run of TencentARC/LLaMA-Pro-8B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [TencentARC/LLaMA-Pro-8B](https://huggingface.co/TencentARC/LLaMA-Pro-8B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TencentARC__LLaMA-Pro-8B",
"harness_winogrande_5",
split="train")
```
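Since this model has two recorded runs, a specific run can also be selected by its timestamped split name instead of the latest one. A sketch using the split names that appear in the configuration metadata, assuming the `harness_winogrande_5` configuration carries the same two run splits as the other tasks:

```python
from datasets import load_dataset

# Each run is stored under a split named after its timestamp;
# "latest" always aliases the most recent run.
run_details = load_dataset("open-llm-leaderboard/details_TencentARC__LLaMA-Pro-8B",
	"harness_winogrande_5",
	split="2024_01_05T15_06_36.564331")
```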
## Latest results
These are the [latest results from run 2024-01-05T15:06:36.564331](https://huggingface.co/datasets/open-llm-leaderboard/details_TencentARC__LLaMA-Pro-8B/blob/main/results_2024-01-05T15-06-36.564331.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4764197375438744,
"acc_stderr": 0.034552955932007474,
"acc_norm": 0.48115021504516975,
"acc_norm_stderr": 0.035323141272306104,
"mc1": 0.24112607099143207,
"mc1_stderr": 0.014974827279752329,
"mc2": 0.38859489598867014,
"mc2_stderr": 0.013678861072074354
},
"harness|arc:challenge|25": {
"acc": 0.49146757679180886,
"acc_stderr": 0.01460926316563219,
"acc_norm": 0.537542662116041,
"acc_norm_stderr": 0.014570144495075578
},
"harness|hellaswag|10": {
"acc": 0.578868751244772,
"acc_stderr": 0.004927314729433552,
"acc_norm": 0.7791276638119896,
"acc_norm_stderr": 0.0041398679751162995
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.49433962264150944,
"acc_stderr": 0.030770900763851302,
"acc_norm": 0.49433962264150944,
"acc_norm_stderr": 0.030770900763851302
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.04166666666666665,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.04166666666666665
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.42196531791907516,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.42196531791907516,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237655,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.42127659574468085,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.42127659574468085,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489364,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489364
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3201058201058201,
"acc_stderr": 0.0240268463928735,
"acc_norm": 0.3201058201058201,
"acc_norm_stderr": 0.0240268463928735
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.03852273364924316,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.03852273364924316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5516129032258065,
"acc_stderr": 0.028292056830112728,
"acc_norm": 0.5516129032258065,
"acc_norm_stderr": 0.028292056830112728
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3793103448275862,
"acc_stderr": 0.03413963805906235,
"acc_norm": 0.3793103448275862,
"acc_norm_stderr": 0.03413963805906235
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5878787878787879,
"acc_stderr": 0.03843566993588717,
"acc_norm": 0.5878787878787879,
"acc_norm_stderr": 0.03843566993588717
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5151515151515151,
"acc_stderr": 0.0356071651653106,
"acc_norm": 0.5151515151515151,
"acc_norm_stderr": 0.0356071651653106
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6839378238341969,
"acc_stderr": 0.033553973696861736,
"acc_norm": 0.6839378238341969,
"acc_norm_stderr": 0.033553973696861736
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.45897435897435895,
"acc_stderr": 0.025265525491284295,
"acc_norm": 0.45897435897435895,
"acc_norm_stderr": 0.025265525491284295
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945284,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945284
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4369747899159664,
"acc_stderr": 0.03221943636566196,
"acc_norm": 0.4369747899159664,
"acc_norm_stderr": 0.03221943636566196
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6073394495412844,
"acc_stderr": 0.020937505161201096,
"acc_norm": 0.6073394495412844,
"acc_norm_stderr": 0.020937505161201096
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.32407407407407407,
"acc_stderr": 0.03191923445686186,
"acc_norm": 0.32407407407407407,
"acc_norm_stderr": 0.03191923445686186
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5441176470588235,
"acc_stderr": 0.034956245220154766,
"acc_norm": 0.5441176470588235,
"acc_norm_stderr": 0.034956245220154766
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5991561181434599,
"acc_stderr": 0.031900803894732356,
"acc_norm": 0.5991561181434599,
"acc_norm_stderr": 0.031900803894732356
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5246636771300448,
"acc_stderr": 0.03351695167652628,
"acc_norm": 0.5246636771300448,
"acc_norm_stderr": 0.03351695167652628
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5877862595419847,
"acc_stderr": 0.04317171194870254,
"acc_norm": 0.5877862595419847,
"acc_norm_stderr": 0.04317171194870254
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.043913262867240704,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.043913262867240704
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.04832853553437055,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.04832853553437055
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.50920245398773,
"acc_stderr": 0.03927705600787443,
"acc_norm": 0.50920245398773,
"acc_norm_stderr": 0.03927705600787443
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.5825242718446602,
"acc_stderr": 0.048828405482122375,
"acc_norm": 0.5825242718446602,
"acc_norm_stderr": 0.048828405482122375
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7051282051282052,
"acc_stderr": 0.029872577708891183,
"acc_norm": 0.7051282051282052,
"acc_norm_stderr": 0.029872577708891183
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.644955300127714,
"acc_stderr": 0.01711208577277299,
"acc_norm": 0.644955300127714,
"acc_norm_stderr": 0.01711208577277299
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5173410404624278,
"acc_stderr": 0.02690290045866664,
"acc_norm": 0.5173410404624278,
"acc_norm_stderr": 0.02690290045866664
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25139664804469275,
"acc_stderr": 0.014508979453553974,
"acc_norm": 0.25139664804469275,
"acc_norm_stderr": 0.014508979453553974
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.545751633986928,
"acc_stderr": 0.028509807802626595,
"acc_norm": 0.545751633986928,
"acc_norm_stderr": 0.028509807802626595
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5627009646302251,
"acc_stderr": 0.028173917761762906,
"acc_norm": 0.5627009646302251,
"acc_norm_stderr": 0.028173917761762906
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.49691358024691357,
"acc_stderr": 0.027820214158594384,
"acc_norm": 0.49691358024691357,
"acc_norm_stderr": 0.027820214158594384
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3475177304964539,
"acc_stderr": 0.02840662780959095,
"acc_norm": 0.3475177304964539,
"acc_norm_stderr": 0.02840662780959095
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3350717079530639,
"acc_stderr": 0.01205549947133037,
"acc_norm": 0.3350717079530639,
"acc_norm_stderr": 0.01205549947133037
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5330882352941176,
"acc_stderr": 0.03030625772246832,
"acc_norm": 0.5330882352941176,
"acc_norm_stderr": 0.03030625772246832
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.020102583895887184,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.020102583895887184
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5363636363636364,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.5363636363636364,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5265306122448979,
"acc_stderr": 0.03196412734523272,
"acc_norm": 0.5265306122448979,
"acc_norm_stderr": 0.03196412734523272
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6567164179104478,
"acc_stderr": 0.03357379665433431,
"acc_norm": 0.6567164179104478,
"acc_norm_stderr": 0.03357379665433431
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6783625730994152,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.6783625730994152,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24112607099143207,
"mc1_stderr": 0.014974827279752329,
"mc2": 0.38859489598867014,
"mc2_stderr": 0.013678861072074354
},
"harness|winogrande|5": {
"acc": 0.7419100236779794,
"acc_stderr": 0.012298278833972392
},
"harness|gsm8k|5": {
"acc": 0.17816527672479152,
"acc_stderr": 0.01054013252754946
}
}
```
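To read these aggregated numbers programmatically rather than from the JSON above, you can load the "results" configuration; a minimal sketch, assuming this repository exposes a "results" configuration with a "latest" split like the other evaluation datasets:

```python
from datasets import load_dataset

# Aggregated metrics for each run; "latest" points at the run shown above.
results = load_dataset("open-llm-leaderboard/details_TencentARC__LLaMA-Pro-8B",
	"results",
	split="latest")
print(results.column_names)  # inspect which aggregated fields are available
```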
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_TencentARC__LLaMA-Pro-8B | [
"region:us"
] | 2024-01-05T15:05:20+00:00 | {"pretty_name": "Evaluation run of TencentARC/LLaMA-Pro-8B", "dataset_summary": "Dataset automatically created during the evaluation run of model [TencentARC/LLaMA-Pro-8B](https://huggingface.co/TencentARC/LLaMA-Pro-8B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TencentARC__LLaMA-Pro-8B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-05T15:06:36.564331](https://huggingface.co/datasets/open-llm-leaderboard/details_TencentARC__LLaMA-Pro-8B/blob/main/results_2024-01-05T15-06-36.564331.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4764197375438744,\n \"acc_stderr\": 0.034552955932007474,\n \"acc_norm\": 0.48115021504516975,\n \"acc_norm_stderr\": 0.035323141272306104,\n \"mc1\": 0.24112607099143207,\n \"mc1_stderr\": 0.014974827279752329,\n \"mc2\": 0.38859489598867014,\n \"mc2_stderr\": 0.013678861072074354\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.49146757679180886,\n \"acc_stderr\": 0.01460926316563219,\n \"acc_norm\": 0.537542662116041,\n \"acc_norm_stderr\": 0.014570144495075578\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.578868751244772,\n \"acc_stderr\": 0.004927314729433552,\n \"acc_norm\": 0.7791276638119896,\n \"acc_norm_stderr\": 0.0041398679751162995\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04063302731486671,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04063302731486671\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.49433962264150944,\n \"acc_stderr\": 0.030770900763851302,\n \"acc_norm\": 0.49433962264150944,\n \"acc_norm_stderr\": 0.030770900763851302\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4583333333333333,\n \"acc_stderr\": 0.04166666666666665,\n \"acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.04166666666666665\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.42196531791907516,\n \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.42196531791907516,\n \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939098,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939098\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.42127659574468085,\n \"acc_stderr\": 0.03227834510146268,\n \"acc_norm\": 0.42127659574468085,\n \"acc_norm_stderr\": 0.03227834510146268\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.04142439719489364,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.04142439719489364\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3201058201058201,\n \"acc_stderr\": 0.0240268463928735,\n \"acc_norm\": 0.3201058201058201,\n \"acc_norm_stderr\": 0.0240268463928735\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n \"acc_stderr\": 0.03852273364924316,\n \"acc_norm\": 0.24603174603174602,\n \"acc_norm_stderr\": 0.03852273364924316\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5516129032258065,\n \"acc_stderr\": 0.028292056830112728,\n \"acc_norm\": 0.5516129032258065,\n \"acc_norm_stderr\": 0.028292056830112728\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3793103448275862,\n \"acc_stderr\": 0.03413963805906235,\n \"acc_norm\": 0.3793103448275862,\n \"acc_norm_stderr\": 0.03413963805906235\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5878787878787879,\n \"acc_stderr\": 0.03843566993588717,\n \"acc_norm\": 0.5878787878787879,\n \"acc_norm_stderr\": 0.03843566993588717\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5151515151515151,\n \"acc_stderr\": 0.0356071651653106,\n \"acc_norm\": 0.5151515151515151,\n \"acc_norm_stderr\": 0.0356071651653106\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6839378238341969,\n \"acc_stderr\": 0.033553973696861736,\n \"acc_norm\": 0.6839378238341969,\n \"acc_norm_stderr\": 0.033553973696861736\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.45897435897435895,\n \"acc_stderr\": 0.025265525491284295,\n \"acc_norm\": 0.45897435897435895,\n \"acc_norm_stderr\": 0.025265525491284295\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945284,\n \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945284\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.4369747899159664,\n \"acc_stderr\": 0.03221943636566196,\n \"acc_norm\": 0.4369747899159664,\n \"acc_norm_stderr\": 0.03221943636566196\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6073394495412844,\n \"acc_stderr\": 0.020937505161201096,\n \"acc_norm\": 0.6073394495412844,\n \"acc_norm_stderr\": 0.020937505161201096\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.32407407407407407,\n \"acc_stderr\": 0.03191923445686186,\n \"acc_norm\": 0.32407407407407407,\n \"acc_norm_stderr\": 0.03191923445686186\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5441176470588235,\n \"acc_stderr\": 0.034956245220154766,\n \"acc_norm\": 0.5441176470588235,\n \"acc_norm_stderr\": 0.034956245220154766\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.5991561181434599,\n \"acc_stderr\": 0.031900803894732356,\n \"acc_norm\": 0.5991561181434599,\n \"acc_norm_stderr\": 0.031900803894732356\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5246636771300448,\n \"acc_stderr\": 0.03351695167652628,\n \"acc_norm\": 0.5246636771300448,\n \"acc_norm_stderr\": 0.03351695167652628\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5877862595419847,\n \"acc_stderr\": 0.04317171194870254,\n \"acc_norm\": 0.5877862595419847,\n \"acc_norm_stderr\": 0.04317171194870254\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.043913262867240704,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.043913262867240704\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.04832853553437055,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.04832853553437055\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.50920245398773,\n \"acc_stderr\": 0.03927705600787443,\n \"acc_norm\": 0.50920245398773,\n \"acc_norm_stderr\": 0.03927705600787443\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5825242718446602,\n \"acc_stderr\": 0.048828405482122375,\n \"acc_norm\": 0.5825242718446602,\n \"acc_norm_stderr\": 0.048828405482122375\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7051282051282052,\n \"acc_stderr\": 0.029872577708891183,\n \"acc_norm\": 0.7051282051282052,\n \"acc_norm_stderr\": 0.029872577708891183\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.644955300127714,\n \"acc_stderr\": 0.01711208577277299,\n \"acc_norm\": 0.644955300127714,\n \"acc_norm_stderr\": 0.01711208577277299\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5173410404624278,\n \"acc_stderr\": 0.02690290045866664,\n \"acc_norm\": 0.5173410404624278,\n \"acc_norm_stderr\": 0.02690290045866664\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25139664804469275,\n \"acc_stderr\": 0.014508979453553974,\n \"acc_norm\": 0.25139664804469275,\n \"acc_norm_stderr\": 0.014508979453553974\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.545751633986928,\n \"acc_stderr\": 0.028509807802626595,\n \"acc_norm\": 0.545751633986928,\n \"acc_norm_stderr\": 0.028509807802626595\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5627009646302251,\n \"acc_stderr\": 0.028173917761762906,\n \"acc_norm\": 0.5627009646302251,\n \"acc_norm_stderr\": 0.028173917761762906\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.49691358024691357,\n \"acc_stderr\": 0.027820214158594384,\n \"acc_norm\": 0.49691358024691357,\n \"acc_norm_stderr\": 0.027820214158594384\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3475177304964539,\n \"acc_stderr\": 0.02840662780959095,\n \"acc_norm\": 0.3475177304964539,\n \"acc_norm_stderr\": 0.02840662780959095\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3350717079530639,\n \"acc_stderr\": 0.01205549947133037,\n \"acc_norm\": 0.3350717079530639,\n \"acc_norm_stderr\": 0.01205549947133037\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5330882352941176,\n \"acc_stderr\": 0.03030625772246832,\n \"acc_norm\": 0.5330882352941176,\n \"acc_norm_stderr\": 0.03030625772246832\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.020102583895887184,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.020102583895887184\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5265306122448979,\n \"acc_stderr\": 0.03196412734523272,\n \"acc_norm\": 0.5265306122448979,\n \"acc_norm_stderr\": 0.03196412734523272\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6567164179104478,\n \"acc_stderr\": 0.03357379665433431,\n \"acc_norm\": 0.6567164179104478,\n \"acc_norm_stderr\": 0.03357379665433431\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6783625730994152,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.6783625730994152,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24112607099143207,\n \"mc1_stderr\": 0.014974827279752329,\n \"mc2\": 0.38859489598867014,\n \"mc2_stderr\": 0.013678861072074354\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7419100236779794,\n \"acc_stderr\": 0.012298278833972392\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.17816527672479152,\n \"acc_stderr\": 0.01054013252754946\n 
}\n}\n```", "repo_url": "https://huggingface.co/TencentARC/LLaMA-Pro-8B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|arc:challenge|25_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|arc:challenge|25_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|gsm8k|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|gsm8k|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hellaswag|10_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hellaswag|10_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T15-02-58.344432.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T15-02-58.344432.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T15-06-36.564331.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T15-06-36.564331.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T15-06-36.564331.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T15-06-36.564331.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T15-02-58.344432.parquet"]}, 
{"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["**/details_harness|winogrande|5_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": ["**/details_harness|winogrande|5_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-05T15-06-36.564331.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_05T15_02_58.344432", "path": ["results_2024-01-05T15-02-58.344432.parquet"]}, {"split": "2024_01_05T15_06_36.564331", "path": 
["results_2024-01-05T15-06-36.564331.parquet"]}, {"split": "latest", "path": ["results_2024-01-05T15-06-36.564331.parquet"]}]}]} | 2024-01-05T15:09:00+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of TencentARC/LLaMA-Pro-8B
Dataset automatically created during the evaluation run of model TencentARC/LLaMA-Pro-8B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
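A minimal sketch of what that call can look like. The repository id below is an assumption that follows the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming; the config name and split name are taken from the config list further down in this card, so adjust them to the task and run you actually want:

```python
from datasets import load_dataset

# Assumed repository id, following the leaderboard's
# "open-llm-leaderboard/details_<org>__<model>" naming convention.
data = load_dataset(
    "open-llm-leaderboard/details_TencentARC__LLaMA-Pro-8B",
    "harness_winogrande_5",   # any config name from the list in this card
    split="latest",           # or a timestamped split such as "2024_01_05T15_06_36.564331"
)
print(data)
```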
## Latest results
These are the latest results from run 2024-01-05T15:06:36.564331 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of TencentARC/LLaMA-Pro-8B\n\n\n\nDataset automatically created during the evaluation run of model TencentARC/LLaMA-Pro-8B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T15:06:36.564331(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TencentARC/LLaMA-Pro-8B\n\n\n\nDataset automatically created during the evaluation run of model TencentARC/LLaMA-Pro-8B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T15:06:36.564331(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
185,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TencentARC/LLaMA-Pro-8B\n\n\n\nDataset automatically created during the evaluation run of model TencentARC/LLaMA-Pro-8B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-05T15:06:36.564331(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
369c4c5db6429f7c66677ed9b77dcd4a95792ac0 |
# Natural Instructions v2 Winogrande Tasks
- Project: https://github.com/allenai/natural-instructions
- Data source: [DataProvenanceInitiative/niv2_submix_original](https://huggingface.co/datasets/DataProvenanceInitiative/niv2_submix_original)
## Details
This dataset contains all Winogrande examples that were included in the [Flan 2022 collection](https://github.com/google-research/FLAN/tree/main/flan/v2), which were originally published in Super-Natural-Instructions.
The data is copied from the preprocessed Natural Instructions v2 dataset at [DataProvenanceInitiative/niv2_submix_original](https://huggingface.co/datasets/DataProvenanceInitiative/niv2_submix_original).
These tasks are:
1. 'task029_winogrande_full_object': Creating a pair of fill in the blank question-answer pairs on objects.
2. 'task030_winogrande_full_person': Creating a pair of fill in the blank questions on persons.
3. 'task031_winogrande_question_generation_object': Writing a fill in the blank question on objects.
4. 'task032_winogrande_question_generation_person': Writing a fill in the blank question on persons.
5. 'task033_winogrande_answer_generation': Answering a fill in the blank question on objects.
6. 'task034_winogrande_question_modification_object': Modifying a fill in the blank question on objects.
7. 'task035_winogrande_question_modification_person': Modifying a fill in the blank question on persons.
8. 'task1391_winogrande_easy_answer_generation': Answering a fill in the blank question on objects.
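Each row carries its task identifier in the `task_name` field (see the Fields section below), so a single task can be pulled out with a filter. A minimal sketch, using the `coref-data/niv2_winogrande_raw` repository id from this card; the `train` split name and the exact `task_name` values are assumptions, so adjust them to what the repository actually exposes:

```python
from datasets import load_dataset

# Repository id taken from this card; the split name is an assumption.
ds = load_dataset("coref-data/niv2_winogrande_raw", split="train")

# Keep only the answer-generation task (task033), assuming task_name
# holds the task identifiers listed above.
task033 = ds.filter(lambda ex: ex["task_name"] == "task033_winogrande_answer_generation")
print(len(task033), "examples")
```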
### Fields
- `inputs`: a `string` feature.
- `targets`: a `string` feature.
- `task_source`: a `string` feature.
- `task_name`: a `string` feature.
- `template_type`: a `string` feature.
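A quick way to confirm these fields is to print one record. Again a sketch: the repository id comes from this card, while the `train` split name is an assumption.

```python
from datasets import load_dataset

ds = load_dataset("coref-data/niv2_winogrande_raw", split="train")  # split name is an assumption
example = ds[0]
for field in ("inputs", "targets", "task_source", "task_name", "template_type"):
    # Truncate long values so the printout stays readable.
    print(f"{field}: {str(example[field])[:80]}")
```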
## Citation
```
@inproceedings{wang-etal-2022-super,
title = "Super-{N}atural{I}nstructions: Generalization via Declarative Instructions on 1600+ {NLP} Tasks",
author = "Wang, Yizhong and
Mishra, Swaroop and
Alipoormolabashi, Pegah and
Kordi, Yeganeh and
Mirzaei, Amirreza and
Naik, Atharva and
Ashok, Arjun and
Dhanasekaran, Arut Selvan and
Arunkumar, Anjana and
Stap, David and
Pathak, Eshaan and
Karamanolakis, Giannis and
Lai, Haizhi and
Purohit, Ishan and
Mondal, Ishani and
Anderson, Jacob and
Kuznia, Kirby and
Doshi, Krima and
Pal, Kuntal Kumar and
Patel, Maitreya and
Moradshahi, Mehrad and
Parmar, Mihir and
Purohit, Mirali and
Varshney, Neeraj and
Kaza, Phani Rohitha and
Verma, Pulkit and
Puri, Ravsehaj Singh and
Karia, Rushang and
Doshi, Savan and
Sampat, Shailaja Keyur and
Mishra, Siddhartha and
Reddy A, Sujan and
Patro, Sumanta and
Dixit, Tanay and
Shen, Xudong",
editor = "Goldberg, Yoav and
Kozareva, Zornitsa and
Zhang, Yue",
booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing",
month = dec,
year = "2022",
address = "Abu Dhabi, United Arab Emirates",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2022.emnlp-main.340",
doi = "10.18653/v1/2022.emnlp-main.340",
pages = "5085--5109",
abstract = "How well can NLP models generalize to a variety of unseen tasks when provided with task instructions? To address this question, we first introduce Super-NaturalInstructions, a benchmark of 1,616 diverse NLP tasks and their expert-written instructions. Our collection covers 76 distinct task types, including but not limited to classification, extraction, infilling, sequence tagging, text rewriting, and text composition. This large and diverse collection of tasks enables rigorous benchmarking of cross-task generalization under instructions{---}training models to follow instructions on a subset of tasks and evaluating them on the remaining unseen ones. Furthermore, we build Tk-Instruct, a transformer model trained to follow a variety of in-context instructions (plain language task definitions or k-shot examples). Our experiments show that Tk-Instruct outperforms existing instruction-following models such as InstructGPT by over 9{\%} on our benchmark despite being an order of magnitude smaller. We further analyze generalization as a function of various scaling parameters, such as the number of observed tasks, the number of instances per task, and model sizes. We hope our dataset and model facilitate future progress towards more general-purpose NLP models.",
}
``` | coref-data/niv2_winogrande_raw | [
"license:apache-2.0",
"region:us"
] | 2024-01-05T15:17:14+00:00 | {"license": "apache-2.0"} | 2024-01-19T00:03:42+00:00 | [] | [] | TAGS
#license-apache-2.0 #region-us
|
# Natural Instructions v2 Winogrande Tasks
- Project: URL
- Data source: DataProvenanceInitiative/niv2_submix_original
## Details
This dataset contains all Winogrande examples that were included in the Flan 2022 collection and were originally published in Super-Natural-Instructions.
The data is copied from the preprocessed Natural Instructions v2 dataset at DataProvenanceInitiative/niv2_submix_original.
These tasks are:
1. 'task029_winogrande_full_object': Creating a pair of fill in the blank question-answer pairs on objects.
2. 'task030_winogrande_full_person': Creating a pair of fill in the blank questions on persons.
3. 'task031_winogrande_question_generation_object': Writing a fill in the blank question on objects.
4. 'task032_winogrande_question_generation_person': Writing a fill in the blank question on persons.
5. 'task033_winogrande_answer_generation': Answering a fill in the blank question on objects.
6. 'task034_winogrande_question_modification_object': Modifying a fill in the blank question on objects.
7. 'task035_winogrande_question_modification_person': Modifying a fill in the blank question on persons.
8. 'task1391_winogrande_easy_answer_generation': Answering a fill in the blank question on objects.
### Fields
- 'inputs': a 'string' feature.
- 'targets': a 'string' feature.
- 'task_source': a 'string' feature.
- 'task_name': a 'string' feature.
- 'template_type': a 'string' feature.
| [
"# Natural Instructions v2 Winogrande Tasks\n\n- Project: URL\n- Data source: DataProvenanceInitiative/niv2_submix_original",
"## Details\n\nThis dataset contains all Winogrande examples that were included in the Flan 2022 collection which were orignally published in Super-Natural-Instructions.\n\nThe data is copied from the preprocessed Natural Instructions v2 dataset at DataProvenanceInitiative/niv2_submix_original.\n\nThese tasks are:\n1. 'task029_winogrande_full_object': Creating a pair of fill in the blank question-answer pairs on objects.\t\n2. 'task030_winogrande_full_person': Creating a pair of fill in the blank questions on persons.\t\n3. 'task031_winogrande_question_generation_object': Writing a fill in the blank question on objects.\t\n4. 'task032_winogrande_question_generation_person': Writing a fill in the blank question on persons.\t\n5. 'task033_winogrande_answer_generation': Answering a fill in the blank question on objects.\t\n6. 'task034_winogrande_question_modification_object': Modifying a fill in the blank question on objects.\t\n7. 'task035_winogrande_question_modification_person': Modifying a fill in the blank question on persons.\t\n8. 'task1391_winogrande_easy_answer_generation': Answering a fill in the blank question on objects.",
"### Fields\n\n- 'inputs': a 'string' feature.\n- 'targets': a 'string' feature.\n- 'task_source': a 'string' feature.\n- 'task_name': a 'string' feature.\n- 'template_type': a 'string' feature."
] | [
"TAGS\n#license-apache-2.0 #region-us \n",
"# Natural Instructions v2 Winogrande Tasks\n\n- Project: URL\n- Data source: DataProvenanceInitiative/niv2_submix_original",
"## Details\n\nThis dataset contains all Winogrande examples that were included in the Flan 2022 collection which were orignally published in Super-Natural-Instructions.\n\nThe data is copied from the preprocessed Natural Instructions v2 dataset at DataProvenanceInitiative/niv2_submix_original.\n\nThese tasks are:\n1. 'task029_winogrande_full_object': Creating a pair of fill in the blank question-answer pairs on objects.\t\n2. 'task030_winogrande_full_person': Creating a pair of fill in the blank questions on persons.\t\n3. 'task031_winogrande_question_generation_object': Writing a fill in the blank question on objects.\t\n4. 'task032_winogrande_question_generation_person': Writing a fill in the blank question on persons.\t\n5. 'task033_winogrande_answer_generation': Answering a fill in the blank question on objects.\t\n6. 'task034_winogrande_question_modification_object': Modifying a fill in the blank question on objects.\t\n7. 'task035_winogrande_question_modification_person': Modifying a fill in the blank question on persons.\t\n8. 'task1391_winogrande_easy_answer_generation': Answering a fill in the blank question on objects.",
"### Fields\n\n- 'inputs': a 'string' feature.\n- 'targets': a 'string' feature.\n- 'task_source': a 'string' feature.\n- 'task_name': a 'string' feature.\n- 'template_type': a 'string' feature."
] | [
14,
36,
333,
72
] | [
"passage: TAGS\n#license-apache-2.0 #region-us \n# Natural Instructions v2 Winogrande Tasks\n\n- Project: URL\n- Data source: DataProvenanceInitiative/niv2_submix_original## Details\n\nThis dataset contains all Winogrande examples that were included in the Flan 2022 collection which were orignally published in Super-Natural-Instructions.\n\nThe data is copied from the preprocessed Natural Instructions v2 dataset at DataProvenanceInitiative/niv2_submix_original.\n\nThese tasks are:\n1. 'task029_winogrande_full_object': Creating a pair of fill in the blank question-answer pairs on objects.\t\n2. 'task030_winogrande_full_person': Creating a pair of fill in the blank questions on persons.\t\n3. 'task031_winogrande_question_generation_object': Writing a fill in the blank question on objects.\t\n4. 'task032_winogrande_question_generation_person': Writing a fill in the blank question on persons.\t\n5. 'task033_winogrande_answer_generation': Answering a fill in the blank question on objects.\t\n6. 'task034_winogrande_question_modification_object': Modifying a fill in the blank question on objects.\t\n7. 'task035_winogrande_question_modification_person': Modifying a fill in the blank question on persons.\t\n8. 'task1391_winogrande_easy_answer_generation': Answering a fill in the blank question on objects.### Fields\n\n- 'inputs': a 'string' feature.\n- 'targets': a 'string' feature.\n- 'task_source': a 'string' feature.\n- 'task_name': a 'string' feature.\n- 'template_type': a 'string' feature."
] |
a1183828a0779c565e3afc8b865bcf5c44ab85cf |
# Dataset Card for Evaluation run of Reverb/Mistral-7B-LoreWeaver
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Reverb/Mistral-7B-LoreWeaver](https://huggingface.co/Reverb/Mistral-7B-LoreWeaver) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Reverb__Mistral-7B-LoreWeaver",
"harness_winogrande_5",
split="train")
```
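The same pattern also works for the aggregated numbers. Below is a minimal sketch, assuming the aggregated metrics live in the "results" configuration described above and that the timestamped splits carry a "latest" alias; adjust the configuration and split names if the repository lists different ones.
```python
from datasets import load_dataset

# Aggregated metrics of the run; the configuration name "results" follows
# the description above, and the split name "latest" is an assumption.
# Replace it with one of the timestamped splits if needed.
results = load_dataset(
    "open-llm-leaderboard/details_Reverb__Mistral-7B-LoreWeaver",
    "results",
    split="latest",
)

print(results[0])  # one row of aggregated metric values
```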
## Latest results
These are the [latest results from run 2024-01-05T15:58:22.377519](https://huggingface.co/datasets/open-llm-leaderboard/details_Reverb__Mistral-7B-LoreWeaver/blob/main/results_2024-01-05T15-58-22.377519.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6377826349872993,
"acc_stderr": 0.03226647554093914,
"acc_norm": 0.6437188756798331,
"acc_norm_stderr": 0.03291664382173368,
"mc1": 0.2802937576499388,
"mc1_stderr": 0.015723139524608763,
"mc2": 0.4215018483148684,
"mc2_stderr": 0.014138981180784167
},
"harness|arc:challenge|25": {
"acc": 0.568259385665529,
"acc_stderr": 0.014474591427196202,
"acc_norm": 0.5998293515358362,
"acc_norm_stderr": 0.014317197787809172
},
"harness|hellaswag|10": {
"acc": 0.6292571200955985,
"acc_stderr": 0.004820166002253078,
"acc_norm": 0.8329018123879706,
"acc_norm_stderr": 0.0037230107458783913
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316091,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316091
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.024976954053155257,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.024976954053155257
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782648,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782648
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.032250781083062896,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.032250781083062896
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386417,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386417
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8238532110091743,
"acc_stderr": 0.016332882393431385,
"acc_norm": 0.8238532110091743,
"acc_norm_stderr": 0.016332882393431385
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.033622774366080424,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.033622774366080424
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.027303484599069436,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.027303484599069436
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229146,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229146
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8173690932311622,
"acc_stderr": 0.013816335389973136,
"acc_norm": 0.8173690932311622,
"acc_norm_stderr": 0.013816335389973136
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.024476994076247333,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.024476994076247333
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3217877094972067,
"acc_stderr": 0.015624236160792579,
"acc_norm": 0.3217877094972067,
"acc_norm_stderr": 0.015624236160792579
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.026082700695399665,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.026082700695399665
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44784876140808344,
"acc_stderr": 0.01270058240476822,
"acc_norm": 0.44784876140808344,
"acc_norm_stderr": 0.01270058240476822
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.01885008469646872,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.01885008469646872
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233264,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233264
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2802937576499388,
"mc1_stderr": 0.015723139524608763,
"mc2": 0.4215018483148684,
"mc2_stderr": 0.014138981180784167
},
"harness|winogrande|5": {
"acc": 0.7837411207576953,
"acc_stderr": 0.01157061486140935
},
"harness|gsm8k|5": {
"acc": 0.37680060652009095,
"acc_stderr": 0.013347858757829158
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Reverb__Mistral-7B-LoreWeaver | [
"region:us"
] | 2024-01-05T15:23:05+00:00 | {"pretty_name": "Evaluation run of Reverb/Mistral-7B-LoreWeaver", "dataset_summary": "Dataset automatically created during the evaluation run of model [Reverb/Mistral-7B-LoreWeaver](https://huggingface.co/Reverb/Mistral-7B-LoreWeaver) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Reverb__Mistral-7B-LoreWeaver\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-05T15:58:22.377519](https://huggingface.co/datasets/open-llm-leaderboard/details_Reverb__Mistral-7B-LoreWeaver/blob/main/results_2024-01-05T15-58-22.377519.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6377826349872993,\n \"acc_stderr\": 0.03226647554093914,\n \"acc_norm\": 0.6437188756798331,\n \"acc_norm_stderr\": 0.03291664382173368,\n \"mc1\": 0.2802937576499388,\n \"mc1_stderr\": 0.015723139524608763,\n \"mc2\": 0.4215018483148684,\n \"mc2_stderr\": 0.014138981180784167\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.568259385665529,\n \"acc_stderr\": 0.014474591427196202,\n \"acc_norm\": 0.5998293515358362,\n \"acc_norm_stderr\": 0.014317197787809172\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6292571200955985,\n \"acc_stderr\": 0.004820166002253078,\n \"acc_norm\": 0.8329018123879706,\n \"acc_norm_stderr\": 0.0037230107458783913\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316091,\n \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316091\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\": 
{\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155257,\n \"acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155257\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n \"acc_stderr\": 0.023904914311782648,\n \"acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.023904914311782648\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.032250781083062896,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.032250781083062896\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386417,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386417\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8238532110091743,\n \"acc_stderr\": 0.016332882393431385,\n \"acc_norm\": 0.8238532110091743,\n \"acc_norm_stderr\": 0.016332882393431385\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5833333333333334,\n \"acc_stderr\": 0.033622774366080424,\n \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.033622774366080424\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7721518987341772,\n \"acc_stderr\": 0.027303484599069436,\n \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.027303484599069436\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.030769352008229146,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.030769352008229146\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159463,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159463\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.8173690932311622,\n \"acc_stderr\": 0.013816335389973136,\n \"acc_norm\": 0.8173690932311622,\n \"acc_norm_stderr\": 0.013816335389973136\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.024476994076247333,\n \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.024476994076247333\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3217877094972067,\n \"acc_stderr\": 0.015624236160792579,\n \"acc_norm\": 0.3217877094972067,\n \"acc_norm_stderr\": 0.015624236160792579\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.026082700695399665,\n \"acc_norm\": 0.6977491961414791,\n \"acc_norm_stderr\": 0.026082700695399665\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44784876140808344,\n \"acc_stderr\": 0.01270058240476822,\n \"acc_norm\": 0.44784876140808344,\n \"acc_norm_stderr\": 0.01270058240476822\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6813725490196079,\n \"acc_stderr\": 0.01885008469646872,\n \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.01885008469646872\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.026508590656233264,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.026508590656233264\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2802937576499388,\n \"mc1_stderr\": 0.015723139524608763,\n \"mc2\": 0.4215018483148684,\n \"mc2_stderr\": 0.014138981180784167\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7837411207576953,\n \"acc_stderr\": 0.01157061486140935\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.37680060652009095,\n \"acc_stderr\": 0.013347858757829158\n }\n}\n```", 
"repo_url": "https://huggingface.co/Reverb/Mistral-7B-LoreWeaver", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|arc:challenge|25_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|arc:challenge|25_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|arc:challenge|25_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|arc:challenge|25_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|gsm8k|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|gsm8k|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|gsm8k|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|gsm8k|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hellaswag|10_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hellaswag|10_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hellaswag|10_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hellaswag|10_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T15-20-48.601124.parquet", 
"**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T15-20-48.601124.parquet", 
"**/details_harness|hendrycksTest-security_studies|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T15-20-48.601124.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T15-38-45.558356.parquet", 
"**/details_harness|hendrycksTest-human_aging|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T15-38-45.558356.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T15-47-35.857036.parquet", 
"**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T15-47-35.857036.parquet", 
"**/details_harness|hendrycksTest-security_studies|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T15-47-35.857036.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T15-58-22.377519.parquet", 
"**/details_harness|hendrycksTest-human_aging|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T15-58-22.377519.parquet", 
"**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T15-58-22.377519.parquet", 
"**/details_harness|hendrycksTest-security_studies|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T15-58-22.377519.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": 
["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": 
["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T15-58-22.377519.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": 
["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": 
"2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": 
["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["**/details_harness|winogrande|5_2024-01-05T15-20-48.601124.parquet"]}, {"split": "2024_01_05T15_38_45.558356", "path": ["**/details_harness|winogrande|5_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["**/details_harness|winogrande|5_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["**/details_harness|winogrande|5_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-05T15-58-22.377519.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_05T15_20_48.601124", "path": ["results_2024-01-05T15-20-48.601124.parquet"]}, 
{"split": "2024_01_05T15_38_45.558356", "path": ["results_2024-01-05T15-38-45.558356.parquet"]}, {"split": "2024_01_05T15_47_35.857036", "path": ["results_2024-01-05T15-47-35.857036.parquet"]}, {"split": "2024_01_05T15_58_22.377519", "path": ["results_2024-01-05T15-58-22.377519.parquet"]}, {"split": "latest", "path": ["results_2024-01-05T15-58-22.377519.parquet"]}]}]} | 2024-01-05T16:00:43+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Reverb/Mistral-7B-LoreWeaver
Dataset automatically created during the evaluation run of model Reverb/Mistral-7B-LoreWeaver on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
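A minimal sketch, assuming the details repository follows the standard Open LLM Leaderboard naming for this model (`open-llm-leaderboard/details_Reverb__Mistral-7B-LoreWeaver`) and using the `harness_winogrande_5` configuration declared in the metadata above:

```python
from datasets import load_dataset

# Load the per-sample details for one task configuration.
# The repo id below is inferred from the leaderboard's usual naming
# convention for details datasets and may need adjusting.
data = load_dataset(
    "open-llm-leaderboard/details_Reverb__Mistral-7B-LoreWeaver",
    "harness_winogrande_5",
    split="train",
)
```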
## Latest results
These are the latest results from run 2024-01-05T15:58:22.377519 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
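A similar sketch for pulling the aggregated numbers themselves (same assumed repo id as above; the "results" configuration and its "latest" split are the ones declared in the metadata):

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics across tasks;
# its "latest" split points at the most recent run (2024-01-05T15:58:22.377519 here).
results = load_dataset(
    "open-llm-leaderboard/details_Reverb__Mistral-7B-LoreWeaver",
    "results",
    split="latest",
)
print(results[0])
```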
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Reverb/Mistral-7B-LoreWeaver\n\n\n\nDataset automatically created during the evaluation run of model Reverb/Mistral-7B-LoreWeaver on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T15:58:22.377519(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Reverb/Mistral-7B-LoreWeaver\n\n\n\nDataset automatically created during the evaluation run of model Reverb/Mistral-7B-LoreWeaver on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T15:58:22.377519(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
187,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Reverb/Mistral-7B-LoreWeaver\n\n\n\nDataset automatically created during the evaluation run of model Reverb/Mistral-7B-LoreWeaver on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-05T15:58:22.377519(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
7c7c77470cd733ae42f1ed4c4e5d0b280d70f6e1 |
# Winogrande v1.1
## Dataset Description
- **Homepage:** [https://leaderboard.allenai.org/winogrande/submissions/get-started](https://leaderboard.allenai.org/winogrande/submissions/get-started)
- **Size of downloaded dataset files:** 20.37 MB
- **Size of the generated dataset:** 10.50 MB
- **Total amount of disk used:** 30.87 MB
### Dataset Summary
WinoGrande is a new collection of 44k problems, inspired by the Winograd Schema Challenge (Levesque, Davis, and Morgenstern
2011), but adjusted to improve the scale and robustness against dataset-specific bias. Formulated as a
fill-in-the-blank task with binary options, the goal is to choose the right option for a given sentence, which requires
commonsense reasoning.
### Data Fields
The data fields are the same among all splits.
- `sentence`: a `string` feature.
- `option1`: a `string` feature.
- `option2`: a `string` feature.
- `answer`: a `string` feature.
### Data Splits
| name |train|validation|test|
|-------------------|----:|---------:|---:|
|winogrande_debiased| 9248| 1267|1767|
|winogrande_l |10234| 1267|1767|
|winogrande_m | 2558| 1267|1767|
|winogrande_s | 640| 1267|1767|
|winogrande_xl |40398| 1267|1767|
|winogrande_xs | 160| 1267|1767|
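As a brief usage sketch (using the configuration and field names documented above), one split can be loaded directly with `datasets`:
```python
from datasets import load_dataset

# Load the training split of the debiased configuration; any of the
# configurations listed in the table above can be substituted here.
winogrande = load_dataset("coref-data/winogrande_raw", "winogrande_debiased", split="train")

example = winogrande[0]
# Each record carries the four string fields described under "Data Fields".
print(example["sentence"])
print(example["option1"], "|", example["option2"], "->", example["answer"])
```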
### Citation Information
```
@InProceedings{ai2:winogrande,
  title  = {WinoGrande: An Adversarial Winograd Schema Challenge at Scale},
  author = {Sakaguchi, Keisuke and Le Bras, Ronan and Bhagavatula, Chandra and Choi, Yejin},
  year   = {2019}
}
```
### Contributions
Thanks to [@thomwolf](https://github.com/thomwolf), [@TevenLeScao](https://github.com/TevenLeScao), [@patrickvonplaten](https://github.com/patrickvonplaten), [@lewtun](https://github.com/lewtun) for adding this dataset. | coref-data/winogrande_raw | [
"license:cc-by-4.0",
"region:us"
] | 2024-01-05T15:30:15+00:00 | {"license": "cc-by-4.0", "configs": [{"config_name": "winogrande_debiased", "data_files": [{"split": "train", "path": "winogrande_debiased/train-*.parquet"}, {"split": "validation", "path": "winogrande_debiased/validation-*.parquet"}, {"split": "test", "path": "winogrande_debiased/test-*.parquet"}]}, {"config_name": "winogrande_l", "data_files": [{"split": "train", "path": "winogrande_l/train-*.parquet"}, {"split": "validation", "path": "winogrande_l/validation-*.parquet"}, {"split": "test", "path": "winogrande_l/test-*.parquet"}]}, {"config_name": "winogrande_m", "data_files": [{"split": "train", "path": "winogrande_m/train-*.parquet"}, {"split": "validation", "path": "winogrande_m/validation-*.parquet"}, {"split": "test", "path": "winogrande_m/test-*.parquet"}]}, {"config_name": "winogrande_s", "data_files": [{"split": "train", "path": "winogrande_s/train-*.parquet"}, {"split": "validation", "path": "winogrande_s/validation-*.parquet"}, {"split": "test", "path": "winogrande_s/test-*.parquet"}]}, {"config_name": "winogrande_xl", "data_files": [{"split": "train", "path": "winogrande_xl/train-*.parquet"}, {"split": "validation", "path": "winogrande_xl/validation-*.parquet"}, {"split": "test", "path": "winogrande_xl/test-*.parquet"}]}, {"config_name": "winogrande_xs", "data_files": [{"split": "train", "path": "winogrande_xs/train-*.parquet"}, {"split": "validation", "path": "winogrande_xs/validation-*.parquet"}, {"split": "test", "path": "winogrande_xs/test-*.parquet"}]}]} | 2024-01-19T00:03:36+00:00 | [] | [] | TAGS
#license-cc-by-4.0 #region-us
| Winogrande v1.1
==============
Dataset Description
-------------------
* Homepage: URL
* Size of downloaded dataset files: 20.37 MB
* Size of the generated dataset: 10.50 MB
* Total amount of disk used: 30.87 MB
### Dataset Summary
WinoGrande is a new collection of 44k problems, inspired by Winograd Schema Challenge (Levesque, Davis, and Morgenstern
2011), but adjusted to improve the scale and robustness against the dataset-specific bias. Formulated as a
fill-in-a-blank task with binary options, the goal is to choose the right option for a given sentence which requires
commonsense reasoning.
### Data Fields
The data fields are the same among all splits.
* 'sentence': a 'string' feature.
* 'option1': a 'string' feature.
* 'option2': a 'string' feature.
* 'answer': a 'string' feature.
### Data Splits
### Contributions
Thanks to @thomwolf, @TevenLeScao, @patrickvonplaten, @lewtun for adding this dataset.
| [
"### Dataset Summary\n\n\nWinoGrande is a new collection of 44k problems, inspired by Winograd Schema Challenge (Levesque, Davis, and Morgenstern\n2011), but adjusted to improve the scale and robustness against the dataset-specific bias. Formulated as a\nfill-in-a-blank task with binary options, the goal is to choose the right option for a given sentence which requires\ncommonsense reasoning.",
"### Data Fields\n\n\nThe data fields are the same among all splits.\n\n\n* 'sentence': a 'string' feature.\n* 'option1': a 'string' feature.\n* 'option2': a 'string' feature.\n* 'answer': a 'string' feature.",
"### Data Splits",
"### Contributions\n\n\nThanks to @thomwolf, @TevenLeScao, @patrickvonplaten, @lewtun for adding this dataset."
] | [
"TAGS\n#license-cc-by-4.0 #region-us \n",
"### Dataset Summary\n\n\nWinoGrande is a new collection of 44k problems, inspired by Winograd Schema Challenge (Levesque, Davis, and Morgenstern\n2011), but adjusted to improve the scale and robustness against the dataset-specific bias. Formulated as a\nfill-in-a-blank task with binary options, the goal is to choose the right option for a given sentence which requires\ncommonsense reasoning.",
"### Data Fields\n\n\nThe data fields are the same among all splits.\n\n\n* 'sentence': a 'string' feature.\n* 'option1': a 'string' feature.\n* 'option2': a 'string' feature.\n* 'answer': a 'string' feature.",
"### Data Splits",
"### Contributions\n\n\nThanks to @thomwolf, @TevenLeScao, @patrickvonplaten, @lewtun for adding this dataset."
] | [
15,
94,
65,
5,
35
] | [
"passage: TAGS\n#license-cc-by-4.0 #region-us \n### Dataset Summary\n\n\nWinoGrande is a new collection of 44k problems, inspired by Winograd Schema Challenge (Levesque, Davis, and Morgenstern\n2011), but adjusted to improve the scale and robustness against the dataset-specific bias. Formulated as a\nfill-in-a-blank task with binary options, the goal is to choose the right option for a given sentence which requires\ncommonsense reasoning.### Data Fields\n\n\nThe data fields are the same among all splits.\n\n\n* 'sentence': a 'string' feature.\n* 'option1': a 'string' feature.\n* 'option2': a 'string' feature.\n* 'answer': a 'string' feature.### Data Splits### Contributions\n\n\nThanks to @thomwolf, @TevenLeScao, @patrickvonplaten, @lewtun for adding this dataset."
] |
f87fbcc57a8646b42f2c42182c5879977afda650 |
## Data Introduction
Over 1.5 million synthetically generated ground-truth/[OCR](https://en.wikipedia.org/wiki/Optical_character_recognition) pairs for post-correction tasks from our paper "[Large Synthetic Data from the ar𝜒iv for OCR Post Correction of Historic Scientific Articles](https://dl.acm.org/doi/10.1007/978-3-031-43849-3_23)".
Synthetic ground truth (SGT) sentences have been mined from the [ar𝜒iv Bulk Downloads](https://info.arxiv.org/help/bulk_data/index.html) source documents,
and Optical Character Recognition (OCR)
sentences have been generated with the [Tesseract](https://github.com/tesseract-ocr/tesseract) OCR engine on the PDF pages generated from compiled source documents.
SGT/OCR pairs come from astronomy articles in the years 1991-2011.
No page augmentation has been applied to any of the PDF documents (i.e., these are "clean" pages without warping, dust, etc.).
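Because the pairs are intended for OCR post-correction, a natural first step is to measure how far the OCR text is from the synthetic ground truth. The sketch below computes a simple character error rate with a plain edit-distance function; the `sgt`/`ocr` strings are made-up stand-ins, not documented field names.
```python
def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance between two strings."""
    if len(a) < len(b):
        a, b = b, a
    previous = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        current = [i]
        for j, cb in enumerate(b, start=1):
            insert_cost = current[j - 1] + 1
            delete_cost = previous[j] + 1
            substitute_cost = previous[j - 1] + (ca != cb)
            current.append(min(insert_cost, delete_cost, substitute_cost))
        previous = current
    return previous[-1]


def character_error_rate(ground_truth: str, ocr_text: str) -> float:
    """Edit distance normalised by the length of the ground-truth string."""
    return levenshtein(ground_truth, ocr_text) / max(len(ground_truth), 1)


# Hypothetical pair; real records would be read from the dataset files.
sgt = "the galaxy rotation curve remains flat at large radii"
ocr = "the ga1axy rotatlon curve rernains flat at large radii"
print(f"CER: {character_error_rate(sgt, ocr):.3f}")
```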
## Resources
### Dataset Versions
* V0 (originally released with the original paper) is available [here](https://zenodo.org/records/8006584)
## Citation
Please reference the following if you make use of this dataset:
```
@inproceedings{10.1007/978-3-031-43849-3_23,
author = {Naiman, J. P. and Cosillo, Morgan G. and Williams, Peter K. G. and Goodman, Alyssa},
title = {Large Synthetic Data From the arχiv For OCR Post Correction Of Historic Scientific Articles},
year = {2023},
isbn = {978-3-031-43848-6},
publisher = {Springer-Verlag},
address = {Berlin, Heidelberg},
url = {https://doi.org/10.1007/978-3-031-43849-3_23},
doi = {10.1007/978-3-031-43849-3_23},
abstract = {Historical scientific articles often require Optical Character Recognition (OCR) to transform scanned documents into machine-readable text, a process that often produces errors. We present a pipeline for the generation of a synthetic ground truth/OCR dataset to correct the OCR results of the astrophysics literature holdings of the NASA Astrophysics Data System (ADS). By mining the arχiv we create, to the authors’ knowledge, the largest scientific synthetic ground truth/OCR post correction dataset of 203,354,393 character pairs. Baseline models trained with this dataset find the mean improvement in character and word error rates of 7.71\% and 18.82\% for historical OCR text, respectively. Interactive dashboards to explore the dataset are available online: , and data and code, are hosted on GitHub: .},
booktitle = {Linking Theory and Practice of Digital Libraries: 27th International Conference on Theory and Practice of Digital Libraries, TPDL 2023, Zadar, Croatia, September 26–29, 2023, Proceedings},
pages = {265–274},
numpages = {10},
keywords = {scholarly document processing, optical character recognition, astronomy},
location = {Zadar, Croatia}
}
```
| ReadingTimeMachine/rtm-sgt-ocr-v1 | [
"task_categories:text-classification",
"task_categories:translation",
"size_categories:1M<n<10M",
"language:en",
"license:apache-2.0",
"region:us"
] | 2024-01-05T15:51:55+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["1M<n<10M"], "task_categories": ["text-classification", "translation"]} | 2024-01-05T16:32:38+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-classification #task_categories-translation #size_categories-1M<n<10M #language-English #license-apache-2.0 #region-us
|
## Data Introduction
Over 1.5 Million synthetically generated ground-truth/OCR pairs for post correction tasks from our paper "Large Synthetic Data from the ar𝜒iv for OCR Post Correction of Historic Scientific Articles".
Synthetic ground truth (SGT) sentences have been mined from the ar𝜒iv Bulk Downloads source documents,
and Optical Character Recognition (OCR)
sentences have been generated with the Tesseract OCR engine on the PDF pages generated from compiled source documents.
SGT/OCR pairs come from astronomy articles in the years 1991-2011.
No page augmentation has been applied to any of the PDF documents (i.e. these are "clean" pages without warping, dust, etc.)
## Resources
### Dataset Versions
* V0 (original released with original paper) is available here
Please reference the following if you make use of this dataset:
| [
"## Data Introduction\n\nOver 1.5 Million synthetically generated ground-truth/OCR pairs for post correction tasks from our paper \"Large Synthetic Data from the ar𝜒iv for OCR Post Correction of Historic Scientific Articles\".\n\nSynthetic ground truth (SGT) sentences have been mined from the ar𝜒iv Bulk Downloads source documents, \nand Optical Character Recognition (OCR) \nsentences have been generated with the Tesseract OCR engine on the PDF pages generated from compiled source documents.\n\nSGT/OCR pairs come from astronomy articles in the years 1991-2011.\n\nNo page augmentation has been applied to any of the PDF documents (i.e. these are \"clean\" pages without warping, dust, etc.)",
"## Resources",
"### Dataset Versions\n\n* V0 (original released with original paper) is available here\n\nPlease reference the following if you make use of this dataset:"
] | [
"TAGS\n#task_categories-text-classification #task_categories-translation #size_categories-1M<n<10M #language-English #license-apache-2.0 #region-us \n",
"## Data Introduction\n\nOver 1.5 Million synthetically generated ground-truth/OCR pairs for post correction tasks from our paper \"Large Synthetic Data from the ar𝜒iv for OCR Post Correction of Historic Scientific Articles\".\n\nSynthetic ground truth (SGT) sentences have been mined from the ar𝜒iv Bulk Downloads source documents, \nand Optical Character Recognition (OCR) \nsentences have been generated with the Tesseract OCR engine on the PDF pages generated from compiled source documents.\n\nSGT/OCR pairs come from astronomy articles in the years 1991-2011.\n\nNo page augmentation has been applied to any of the PDF documents (i.e. these are \"clean\" pages without warping, dust, etc.)",
"## Resources",
"### Dataset Versions\n\n* V0 (original released with original paper) is available here\n\nPlease reference the following if you make use of this dataset:"
] | [
50,
174,
3,
32
] | [
"passage: TAGS\n#task_categories-text-classification #task_categories-translation #size_categories-1M<n<10M #language-English #license-apache-2.0 #region-us \n## Data Introduction\n\nOver 1.5 Million synthetically generated ground-truth/OCR pairs for post correction tasks from our paper \"Large Synthetic Data from the ar𝜒iv for OCR Post Correction of Historic Scientific Articles\".\n\nSynthetic ground truth (SGT) sentences have been mined from the ar𝜒iv Bulk Downloads source documents, \nand Optical Character Recognition (OCR) \nsentences have been generated with the Tesseract OCR engine on the PDF pages generated from compiled source documents.\n\nSGT/OCR pairs come from astronomy articles in the years 1991-2011.\n\nNo page augmentation has been applied to any of the PDF documents (i.e. these are \"clean\" pages without warping, dust, etc.)## Resources### Dataset Versions\n\n* V0 (original released with original paper) is available here\n\nPlease reference the following if you make use of this dataset:"
] |
159eb26f4bbec6f015b94e9d5091bd84d09ebfb0 |
# Dataset Card for Evaluation run of EmbeddedLLM/Mistral-7B-Merge-14-v0.3-ft-step-9984
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [EmbeddedLLM/Mistral-7B-Merge-14-v0.3-ft-step-9984](https://huggingface.co/EmbeddedLLM/Mistral-7B-Merge-14-v0.3-ft-step-9984) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0.3-ft-step-9984",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-05T15:51:45.493023](https://huggingface.co/datasets/open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0.3-ft-step-9984/blob/main/results_2024-01-05T15-51-45.493023.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6244808886385845,
"acc_stderr": 0.03243476112893074,
"acc_norm": 0.6323862837346002,
"acc_norm_stderr": 0.033106091377154055,
"mc1": 0.3671970624235006,
"mc1_stderr": 0.01687480500145318,
"mc2": 0.5369936269743123,
"mc2_stderr": 0.015292357434384245
},
"harness|arc:challenge|25": {
"acc": 0.5895904436860068,
"acc_stderr": 0.014374922192642667,
"acc_norm": 0.6254266211604096,
"acc_norm_stderr": 0.014144193471893454
},
"harness|hellaswag|10": {
"acc": 0.6316470822545309,
"acc_stderr": 0.0048137199528299705,
"acc_norm": 0.8218482374029078,
"acc_norm_stderr": 0.003818584384635532
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933714,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933714
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.03773809990686934,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.03773809990686934
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5361702127659574,
"acc_stderr": 0.032600385118357715,
"acc_norm": 0.5361702127659574,
"acc_norm_stderr": 0.032600385118357715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642507,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642507
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.03517603540361008,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.03517603540361008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494563,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494563
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.02338193534812143,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.02338193534812143
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6384615384615384,
"acc_stderr": 0.02435958146539699,
"acc_norm": 0.6384615384615384,
"acc_norm_stderr": 0.02435958146539699
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547308,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547308
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.02934457250063433,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.02934457250063433
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242741,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242741
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8146788990825689,
"acc_stderr": 0.016659279700295827,
"acc_norm": 0.8146788990825689,
"acc_norm_stderr": 0.016659279700295827
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.02862654791243741,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.02862654791243741
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233494,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233494
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459754,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459754
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.80970625798212,
"acc_stderr": 0.014036945850381387,
"acc_norm": 0.80970625798212,
"acc_norm_stderr": 0.014036945850381387
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.02425790170532337,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.02425790170532337
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3340782122905028,
"acc_stderr": 0.01577491142238163,
"acc_norm": 0.3340782122905028,
"acc_norm_stderr": 0.01577491142238163
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.026493033225145894,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.026493033225145894
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6697530864197531,
"acc_stderr": 0.026168298456732846,
"acc_norm": 0.6697530864197531,
"acc_norm_stderr": 0.026168298456732846
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4485006518904824,
"acc_stderr": 0.012702317490559807,
"acc_norm": 0.4485006518904824,
"acc_norm_stderr": 0.012702317490559807
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6139705882352942,
"acc_stderr": 0.029573269134411124,
"acc_norm": 0.6139705882352942,
"acc_norm_stderr": 0.029573269134411124
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6437908496732027,
"acc_stderr": 0.0193733324207245,
"acc_norm": 0.6437908496732027,
"acc_norm_stderr": 0.0193733324207245
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3671970624235006,
"mc1_stderr": 0.01687480500145318,
"mc2": 0.5369936269743123,
"mc2_stderr": 0.015292357434384245
},
"harness|winogrande|5": {
"acc": 0.7561168113654302,
"acc_stderr": 0.012068923278908192
},
"harness|gsm8k|5": {
"acc": 0.25246398786959817,
"acc_stderr": 0.011966250044833995
}
}
```
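As a small usage sketch, the per-task entries in the block above can be aggregated, for example to average the `hendrycksTest` (MMLU) sub-task accuracies. The two values below are copied from the results shown; in practice the full dictionary would be parsed from the results file or the "results" configuration.
```python
import json

# Stand-in for the full dictionary printed above; the two entries are copied
# from the "Latest results" block, the rest would come from the results file.
raw = """
{
  "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.39},
  "harness|hendrycksTest-anatomy|5": {"acc": 0.5777777777777777}
}
"""
results = json.loads(raw)

mmlu_accs = [scores["acc"]
             for task, scores in results.items()
             if task.startswith("harness|hendrycksTest")]
print(f"{len(mmlu_accs)} MMLU sub-tasks, mean acc = {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```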
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0.3-ft-step-9984 | [
"region:us"
] | 2024-01-05T15:54:03+00:00 | {"pretty_name": "Evaluation run of EmbeddedLLM/Mistral-7B-Merge-14-v0.3-ft-step-9984", "dataset_summary": "Dataset automatically created during the evaluation run of model [EmbeddedLLM/Mistral-7B-Merge-14-v0.3-ft-step-9984](https://huggingface.co/EmbeddedLLM/Mistral-7B-Merge-14-v0.3-ft-step-9984) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0.3-ft-step-9984\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-05T15:51:45.493023](https://huggingface.co/datasets/open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0.3-ft-step-9984/blob/main/results_2024-01-05T15-51-45.493023.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6244808886385845,\n \"acc_stderr\": 0.03243476112893074,\n \"acc_norm\": 0.6323862837346002,\n \"acc_norm_stderr\": 0.033106091377154055,\n \"mc1\": 0.3671970624235006,\n \"mc1_stderr\": 0.01687480500145318,\n \"mc2\": 0.5369936269743123,\n \"mc2_stderr\": 0.015292357434384245\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5895904436860068,\n \"acc_stderr\": 0.014374922192642667,\n \"acc_norm\": 0.6254266211604096,\n \"acc_norm_stderr\": 0.014144193471893454\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6316470822545309,\n \"acc_stderr\": 0.0048137199528299705,\n \"acc_norm\": 0.8218482374029078,\n \"acc_norm_stderr\": 0.003818584384635532\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n \"acc_stderr\": 0.03773809990686934,\n \"acc_norm\": 
0.7152777777777778,\n \"acc_norm_stderr\": 0.03773809990686934\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.032600385118357715,\n \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.032600385118357715\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642507,\n \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642507\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494563,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494563\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 
0.02338193534812143,\n \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.02338193534812143\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6384615384615384,\n \"acc_stderr\": 0.02435958146539699,\n \"acc_norm\": 0.6384615384615384,\n \"acc_norm_stderr\": 0.02435958146539699\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.02934457250063433,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.02934457250063433\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242741,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242741\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8146788990825689,\n \"acc_stderr\": 0.016659279700295827,\n \"acc_norm\": 0.8146788990825689,\n \"acc_norm_stderr\": 0.016659279700295827\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.02862654791243741,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.02862654791243741\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233494,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233494\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.02280138253459754,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.02280138253459754\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.80970625798212,\n \"acc_stderr\": 0.014036945850381387,\n \"acc_norm\": 0.80970625798212,\n \"acc_norm_stderr\": 0.014036945850381387\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.02425790170532337,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.02425790170532337\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3340782122905028,\n \"acc_stderr\": 0.01577491142238163,\n \"acc_norm\": 0.3340782122905028,\n \"acc_norm_stderr\": 0.01577491142238163\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.026493033225145894,\n \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.026493033225145894\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6697530864197531,\n \"acc_stderr\": 0.026168298456732846,\n \"acc_norm\": 0.6697530864197531,\n \"acc_norm_stderr\": 0.026168298456732846\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4485006518904824,\n \"acc_stderr\": 0.012702317490559807,\n \"acc_norm\": 0.4485006518904824,\n \"acc_norm_stderr\": 0.012702317490559807\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6139705882352942,\n \"acc_stderr\": 0.029573269134411124,\n \"acc_norm\": 0.6139705882352942,\n \"acc_norm_stderr\": 0.029573269134411124\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6437908496732027,\n \"acc_stderr\": 0.0193733324207245,\n \"acc_norm\": 0.6437908496732027,\n \"acc_norm_stderr\": 0.0193733324207245\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3671970624235006,\n \"mc1_stderr\": 0.01687480500145318,\n \"mc2\": 0.5369936269743123,\n \"mc2_stderr\": 0.015292357434384245\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7561168113654302,\n \"acc_stderr\": 0.012068923278908192\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.25246398786959817,\n \"acc_stderr\": 0.011966250044833995\n }\n}\n```", "repo_url": "https://huggingface.co/EmbeddedLLM/Mistral-7B-Merge-14-v0.3-ft-step-9984", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|arc:challenge|25_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|gsm8k|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hellaswag|10_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T15-51-45.493023.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T15-51-45.493023.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T15-51-45.493023.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T15-51-45.493023.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T15-51-45.493023.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["**/details_harness|winogrande|5_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-05T15-51-45.493023.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_05T15_51_45.493023", "path": ["results_2024-01-05T15-51-45.493023.parquet"]}, {"split": "latest", "path": ["results_2024-01-05T15-51-45.493023.parquet"]}]}]} | 2024-01-05T15:54:25+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of EmbeddedLLM/Mistral-7B-Merge-14-v0.3-ft-step-9984
Dataset automatically created during the evaluation run of model EmbeddedLLM/Mistral-7B-Merge-14-v0.3-ft-step-9984 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-05T15:51:45.493023 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of EmbeddedLLM/Mistral-7B-Merge-14-v0.3-ft-step-9984\n\n\n\nDataset automatically created during the evaluation run of model EmbeddedLLM/Mistral-7B-Merge-14-v0.3-ft-step-9984 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T15:51:45.493023(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of EmbeddedLLM/Mistral-7B-Merge-14-v0.3-ft-step-9984\n\n\n\nDataset automatically created during the evaluation run of model EmbeddedLLM/Mistral-7B-Merge-14-v0.3-ft-step-9984 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T15:51:45.493023(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
209,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of EmbeddedLLM/Mistral-7B-Merge-14-v0.3-ft-step-9984\n\n\n\nDataset automatically created during the evaluation run of model EmbeddedLLM/Mistral-7B-Merge-14-v0.3-ft-step-9984 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-05T15:51:45.493023(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]"
] |
a29d52273f0adba5e7ffa7f281020eca047d03ff |
# Dataset Card for Evaluation run of EmbeddedLLM/Mistral-7B-Merge-14-v0.3-ft-step-15936
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [EmbeddedLLM/Mistral-7B-Merge-14-v0.3-ft-step-15936](https://huggingface.co/EmbeddedLLM/Mistral-7B-Merge-14-v0.3-ft-step-15936) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0.3-ft-step-15936",
"harness_winogrande_5",
split="train")
```
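
If you want to see which of the 63 per-task configurations exist before picking one, the sketch below (an illustration assuming only the `datasets` library and the repo name above; the task config `harness_hellaswag_10` is just one example of the configs listed in this card's metadata) lists them and loads the latest run of a single task:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0.3-ft-step-15936"

# One configuration per evaluated task, plus the aggregated "results" configuration.
configs = get_dataset_config_names(repo)
print(len(configs), "configurations available")

# Each configuration exposes a "latest" split pointing to the most recent run.
hellaswag = load_dataset(repo, "harness_hellaswag_10", split="latest")
print(hellaswag)
```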
## Latest results
These are the [latest results from run 2024-01-05T15:58:44.045673](https://huggingface.co/datasets/open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0.3-ft-step-15936/blob/main/results_2024-01-05T15-58-44.045673.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6262378503301176,
"acc_stderr": 0.0325953364628108,
"acc_norm": 0.6291102653356665,
"acc_norm_stderr": 0.03324343974226206,
"mc1": 0.3769889840881273,
"mc1_stderr": 0.01696551757893035,
"mc2": 0.5510541354864276,
"mc2_stderr": 0.015394064463612044
},
"harness|arc:challenge|25": {
"acc": 0.5938566552901023,
"acc_stderr": 0.014351656690097858,
"acc_norm": 0.6254266211604096,
"acc_norm_stderr": 0.014144193471893456
},
"harness|hellaswag|10": {
"acc": 0.6346345349531965,
"acc_stderr": 0.004805483767055347,
"acc_norm": 0.8213503286197968,
"acc_norm_stderr": 0.003822758343922911
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.038234289699266046,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.038234289699266046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.03800968060554858,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.03800968060554858
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082637,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082637
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.02525303255499769,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.02525303255499769
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268545,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268545
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217483,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217483
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593566,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593566
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6333333333333333,
"acc_stderr": 0.02443301646605246,
"acc_norm": 0.6333333333333333,
"acc_norm_stderr": 0.02443301646605246
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473082,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473082
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242741,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242741
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8110091743119267,
"acc_stderr": 0.01678548115920362,
"acc_norm": 0.8110091743119267,
"acc_norm_stderr": 0.01678548115920362
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.031024411740572213,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.031024411740572213
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.038968789850704164,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.038968789850704164
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.023636873317489294,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.023636873317489294
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3307262569832402,
"acc_stderr": 0.01573502625896612,
"acc_norm": 0.3307262569832402,
"acc_norm_stderr": 0.01573502625896612
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.02617390850671858,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.02617390850671858
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.02623696588115326,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.02623696588115326
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6728395061728395,
"acc_stderr": 0.026105673861409818,
"acc_norm": 0.6728395061728395,
"acc_norm_stderr": 0.026105673861409818
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4471968709256845,
"acc_stderr": 0.012698825252435106,
"acc_norm": 0.4471968709256845,
"acc_norm_stderr": 0.012698825252435106
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6139705882352942,
"acc_stderr": 0.029573269134411124,
"acc_norm": 0.6139705882352942,
"acc_norm_stderr": 0.029573269134411124
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6421568627450981,
"acc_stderr": 0.01939305840235544,
"acc_norm": 0.6421568627450981,
"acc_norm_stderr": 0.01939305840235544
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142777,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142777
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306046,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306046
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3769889840881273,
"mc1_stderr": 0.01696551757893035,
"mc2": 0.5510541354864276,
"mc2_stderr": 0.015394064463612044
},
"harness|winogrande|5": {
"acc": 0.7576953433307024,
"acc_stderr": 0.012042352526174787
},
"harness|gsm8k|5": {
"acc": 0.5435936315390447,
"acc_stderr": 0.013720038270485332
}
}
```
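
If you would rather read these aggregated numbers programmatically than from the JSON above, a minimal sketch (assuming the "results" configuration and its "latest" split described earlier; the exact column schema may differ between harness versions) could be:

```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0.3-ft-step-15936"

# The "results" configuration stores the aggregated metrics of each run;
# the "latest" split points to the most recent one.
results = load_dataset(repo, "results", split="latest")

print(results.num_rows, "row(s) of aggregated results")
print(results.column_names[:10])  # peek at the first few columns
```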
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0.3-ft-step-15936 | [
"region:us"
] | 2024-01-05T16:01:06+00:00 | {"pretty_name": "Evaluation run of EmbeddedLLM/Mistral-7B-Merge-14-v0.3-ft-step-15936", "dataset_summary": "Dataset automatically created during the evaluation run of model [EmbeddedLLM/Mistral-7B-Merge-14-v0.3-ft-step-15936](https://huggingface.co/EmbeddedLLM/Mistral-7B-Merge-14-v0.3-ft-step-15936) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0.3-ft-step-15936\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-05T15:58:44.045673](https://huggingface.co/datasets/open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0.3-ft-step-15936/blob/main/results_2024-01-05T15-58-44.045673.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6262378503301176,\n \"acc_stderr\": 0.0325953364628108,\n \"acc_norm\": 0.6291102653356665,\n \"acc_norm_stderr\": 0.03324343974226206,\n \"mc1\": 0.3769889840881273,\n \"mc1_stderr\": 0.01696551757893035,\n \"mc2\": 0.5510541354864276,\n \"mc2_stderr\": 0.015394064463612044\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5938566552901023,\n \"acc_stderr\": 0.014351656690097858,\n \"acc_norm\": 0.6254266211604096,\n \"acc_norm_stderr\": 0.014144193471893456\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6346345349531965,\n \"acc_stderr\": 0.004805483767055347,\n \"acc_norm\": 0.8213503286197968,\n \"acc_norm_stderr\": 0.003822758343922911\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.038234289699266046,\n \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.038234289699266046\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n \"acc_stderr\": 0.03800968060554858,\n \"acc_norm\": 
0.7083333333333334,\n \"acc_norm_stderr\": 0.03800968060554858\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082637,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082637\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4021164021164021,\n \"acc_stderr\": 0.02525303255499769,\n \"acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.02525303255499769\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268545,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268545\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217483,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217483\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8756476683937824,\n \"acc_stderr\": 
0.023814477086593566,\n \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593566\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6333333333333333,\n \"acc_stderr\": 0.02443301646605246,\n \"acc_norm\": 0.6333333333333333,\n \"acc_norm_stderr\": 0.02443301646605246\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473082,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473082\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242741,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242741\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8110091743119267,\n \"acc_stderr\": 0.01678548115920362,\n \"acc_norm\": 0.8110091743119267,\n \"acc_norm_stderr\": 0.01678548115920362\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.031024411740572213,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.031024411740572213\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.038968789850704164,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.038968789850704164\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n \"acc_stderr\": 0.023636873317489294,\n \"acc_norm\": 0.8461538461538461,\n \"acc_norm_stderr\": 0.023636873317489294\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n 
\"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3307262569832402,\n \"acc_stderr\": 0.01573502625896612,\n \"acc_norm\": 0.3307262569832402,\n \"acc_norm_stderr\": 0.01573502625896612\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.02617390850671858,\n \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.02617390850671858\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n \"acc_stderr\": 0.02623696588115326,\n \"acc_norm\": 0.6913183279742765,\n \"acc_norm_stderr\": 0.02623696588115326\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6728395061728395,\n \"acc_stderr\": 0.026105673861409818,\n \"acc_norm\": 0.6728395061728395,\n \"acc_norm_stderr\": 0.026105673861409818\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4471968709256845,\n \"acc_stderr\": 0.012698825252435106,\n \"acc_norm\": 0.4471968709256845,\n \"acc_norm_stderr\": 0.012698825252435106\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6139705882352942,\n \"acc_stderr\": 0.029573269134411124,\n \"acc_norm\": 0.6139705882352942,\n \"acc_norm_stderr\": 0.029573269134411124\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6421568627450981,\n \"acc_stderr\": 0.01939305840235544,\n \"acc_norm\": 0.6421568627450981,\n \"acc_norm_stderr\": 0.01939305840235544\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142777,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142777\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n \"acc_stderr\": 0.024845753212306046,\n \"acc_norm\": 0.8557213930348259,\n \"acc_norm_stderr\": 0.024845753212306046\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3769889840881273,\n \"mc1_stderr\": 0.01696551757893035,\n \"mc2\": 0.5510541354864276,\n \"mc2_stderr\": 0.015394064463612044\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7576953433307024,\n \"acc_stderr\": 
0.012042352526174787\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5435936315390447,\n \"acc_stderr\": 0.013720038270485332\n }\n}\n```", "repo_url": "https://huggingface.co/EmbeddedLLM/Mistral-7B-Merge-14-v0.3-ft-step-15936", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|arc:challenge|25_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|gsm8k|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hellaswag|10_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T15-58-44.045673.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T15-58-44.045673.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T15-58-44.045673.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T15-58-44.045673.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T15-58-44.045673.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["**/details_harness|winogrande|5_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2024-01-05T15-58-44.045673.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_05T15_58_44.045673", "path": ["results_2024-01-05T15-58-44.045673.parquet"]}, {"split": "latest", "path": ["results_2024-01-05T15-58-44.045673.parquet"]}]}]} | 2024-01-05T16:01:28+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of EmbeddedLLM/Mistral-7B-Merge-14-v0.3-ft-step-15936
Dataset automatically created during the evaluation run of model EmbeddedLLM/Mistral-7B-Merge-14-v0.3-ft-step-15936 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
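For instance, a minimal sketch (the dataset repository id below is assumed to follow the leaderboard's usual `details_<org>__<model>` naming, which is not stated explicitly in this excerpt; the config and split names are taken from this card's configuration list):

```python
from datasets import load_dataset

# Assumed repository id following the leaderboard's usual naming convention
data = load_dataset(
    "open-llm-leaderboard/details_EmbeddedLLM__Mistral-7B-Merge-14-v0.3-ft-step-15936",
    "harness_winogrande_5",   # any per-task config listed on this card works here
    split="latest",           # or the timestamped split for a specific run
)
print(data)
```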
## Latest results
These are the latest results from run 2024-01-05T15:58:44.045673 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of EmbeddedLLM/Mistral-7B-Merge-14-v0.3-ft-step-15936\n\n\n\nDataset automatically created during the evaluation run of model EmbeddedLLM/Mistral-7B-Merge-14-v0.3-ft-step-15936 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T15:58:44.045673(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of EmbeddedLLM/Mistral-7B-Merge-14-v0.3-ft-step-15936\n\n\n\nDataset automatically created during the evaluation run of model EmbeddedLLM/Mistral-7B-Merge-14-v0.3-ft-step-15936 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T15:58:44.045673(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
209,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of EmbeddedLLM/Mistral-7B-Merge-14-v0.3-ft-step-15936\n\n\n\nDataset automatically created during the evaluation run of model EmbeddedLLM/Mistral-7B-Merge-14-v0.3-ft-step-15936 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-05T15:58:44.045673(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]"
] |
4ca5df0d01a75f2b4d34fbab2e8e20fc22d6b0cf | # Dataset Card for "araproje_hellaswag_en_conf1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ibranze/araproje_hellaswag_en_conf1 | [
"region:us"
] | 2024-01-05T16:06:37+00:00 | {"dataset_info": {"features": [{"name": "ind", "dtype": "int32"}, {"name": "activity_label", "dtype": "string"}, {"name": "ctx_a", "dtype": "string"}, {"name": "ctx_b", "dtype": "string"}, {"name": "ctx", "dtype": "string"}, {"name": "endings", "sequence": "string"}, {"name": "source_id", "dtype": "string"}, {"name": "split", "dtype": "string"}, {"name": "split_type", "dtype": "string"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "validation", "num_bytes": 149738.0, "num_examples": 250}, {"name": "dev", "num_bytes": 5989.52, "num_examples": 10}], "download_size": 91075, "dataset_size": 155727.52}, "configs": [{"config_name": "default", "data_files": [{"split": "validation", "path": "data/validation-*"}, {"split": "dev", "path": "data/dev-*"}]}]} | 2024-01-05T19:43:31+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "araproje_hellaswag_en_conf1"
More Information needed | [
"# Dataset Card for \"araproje_hellaswag_en_conf1\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"araproje_hellaswag_en_conf1\"\n\nMore Information needed"
] | [
6,
22
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"araproje_hellaswag_en_conf1\"\n\nMore Information needed"
] |
9d687e15dd3e17b03256d5126b5623fe4951a031 | # Dataset Card for "araproje_hellaswag_en_conf2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ibranze/araproje_hellaswag_en_conf2 | [
"region:us"
] | 2024-01-05T16:06:41+00:00 | {"dataset_info": {"features": [{"name": "ind", "dtype": "int32"}, {"name": "activity_label", "dtype": "string"}, {"name": "ctx_a", "dtype": "string"}, {"name": "ctx_b", "dtype": "string"}, {"name": "ctx", "dtype": "string"}, {"name": "endings", "sequence": "string"}, {"name": "source_id", "dtype": "string"}, {"name": "split", "dtype": "string"}, {"name": "split_type", "dtype": "string"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "validation", "num_bytes": 149738.0, "num_examples": 250}], "download_size": 80579, "dataset_size": 149738.0}, "configs": [{"config_name": "default", "data_files": [{"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-05T16:06:43+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "araproje_hellaswag_en_conf2"
More Information needed | [
"# Dataset Card for \"araproje_hellaswag_en_conf2\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"araproje_hellaswag_en_conf2\"\n\nMore Information needed"
] | [
6,
22
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"araproje_hellaswag_en_conf2\"\n\nMore Information needed"
] |
18fa954670f0a08c9a673a0a428e93f3ae0eda88 |
# The original Winograd Schema Challenge (WSC) as hosted by Ernest Davis
## Dataset Description
- **Homepage:** https://cs.nyu.edu/faculty/davise/papers/WinogradSchemas/WS.html
- **Paper:** https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.729.9814&rep=rep1&type=pdf
### Dataset Summary
The original Winograd Schema Challenge (WSC) consisted of 136 schemas resulting in 273 problems. This was later expanded to 150 schemas resulting in 285 problems.
A Winograd schema is a pair of sentences that differ in only one or two words and that contain an ambiguity that is
resolved in opposite ways in the two sentences and requires the use of world knowledge and reasoning for its
resolution. The schema takes its name from a well-known example by Terry Winograd:
> The city councilmen refused the demonstrators a permit because they [feared/advocated] violence.
If the word is "feared", then "they" presumably refers to the city council; if it is "advocated" then "they"
presumably refers to the demonstrators.
## Dataset Structure
### Data Instances
Each instance contains a text passage with a designated pronoun and two possible answers indicating which entity in
the passage the pronoun represents. An example instance looks like the following:
```python
{
'label': 0,
'options': ['The city councilmen', 'The demonstrators'],
'pronoun': 'they',
'pronoun_loc': 63,
'quote': 'they feared violence',
'quote_loc': 63,
'source': '(Winograd 1972)',
'text': 'The city councilmen refused the demonstrators a permit because they feared violence.'
}
```
### Data Fields
- `text` (str): The text sequence
- `options` (list[str]): The two entity options that the pronoun may be referring to
- `label` (int): The index of the correct option in the `options` field
- `pronoun` (str): The pronoun in the sequence to be resolved
- `pronoun_loc` (int): The starting position of the pronoun in the sequence
- `quote` (str): The substring containing the key action or context surrounding the pronoun
- `quote_loc` (int): The starting position of the quote in the sequence
- `source` (str): A description of the source who contributed the example
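
As an illustration, here is a minimal sketch of loading the data and resolving the pronoun from these fields (the repository id and the `wsc273`/`wsc285` config names are taken from this card's configuration; adjust them if the dataset is hosted elsewhere):

```python
from datasets import load_dataset

# "wsc273" is the original 273-problem set; "wsc285" selects the expanded version
wsc = load_dataset("coref-data/davis_wsc_raw", "wsc273", split="test")

example = wsc[0]
# The label indexes into the two candidate entities in `options`
resolved_entity = example["options"][example["label"]]
print(example["text"])
print(f'"{example["pronoun"]}" -> {resolved_entity}')
```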
### Licensing Information
This work is licensed under a [Creative Commons Attribution 4.0 International
License](https://creativecommons.org/licenses/by/4.0/).
### Citation Information
The Winograd Schema Challenge including many of the examples here was proposed by
[Levesque et al 2012](https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.729.9814&rep=rep1&type=pdf):
```
@inproceedings{levesque2012winograd,
title={The winograd schema challenge},
author={Levesque, Hector and Davis, Ernest and Morgenstern, Leora},
booktitle={Thirteenth International Conference on the Principles of Knowledge Representation and Reasoning},
year={2012},
organization={Citeseer}
}
```
### Contributions
Modified from loading script of: [@joeddav](https://github.com/joeddav). | coref-data/davis_wsc_raw | [
"license:cc-by-4.0",
"region:us"
] | 2024-01-05T16:13:47+00:00 | {"license": "cc-by-4.0", "configs": [{"config_name": "wsc273", "data_files": [{"split": "test", "path": "wsc273/test-*.parquet"}]}, {"config_name": "wsc285", "data_files": [{"split": "test", "path": "wsc285/test-*.parquet"}]}]} | 2024-01-19T00:03:40+00:00 | [] | [] | TAGS
#license-cc-by-4.0 #region-us
|
# The original Winograd Schema Challenge (WSC) as hosted by Ernest Davis
## Dataset Description
- Homepage: URL
- Paper: URL
### Dataset Summary
The original Winograd Schema Challenge (WSC) consisted of 136 schemas resulting in 273 problems. This was later expanded to 150 schemas resulting in 285 problems.
A Winograd schema is a pair of sentences that differ in only one or two words and that contain an ambiguity that is
resolved in opposite ways in the two sentences and requires the use of world knowledge and reasoning for its
resolution. The schema takes its name from a well-known example by Terry Winograd:
> The city councilmen refused the demonstrators a permit because they [feared/advocated] violence.
If the word is "feared", then "they" presumably refers to the city council; if it is "advocated" then "they"
presumably refers to the demonstrators.
## Dataset Structure
### Data Instances
Each instance contains a text passage with a designated pronoun and two possible answers indicating which entity in
the passage the pronoun represents. An example instance looks like the following:
### Data Fields
- 'text' (str): The text sequence
- 'options' (list[str]): The two entity options that the pronoun may be referring to
- 'label' (int): The index of the correct option in the 'options' field
- 'pronoun' (str): The pronoun in the sequence to be resolved
- 'pronoun_loc' (int): The starting position of the pronoun in the sequence
- 'quote' (str): The substr with the key action or context surrounding the pronoun
- 'quote_loc' (int): The starting position of the quote in the sequence
- 'source' (str): A description of the source who contributed the example
### Licensing Information
This work is licensed under a Creative Commons Attribution 4.0 International
License.
The Winograd Schema Challenge including many of the examples here was proposed by
Levesque et al 2012:
### Contributions
Modified from loading script of: @joeddav. | [
"# The original Winograd Schema Challenge (WSC) as hosted by Ernest Davis",
"## Dataset Description\n\n- Homepage: URL\n- Paper: URL",
"### Dataset Summary\n\nThe original Winograd Schema Challenge (WSC) consisted of 136 schemas resulting in 273 problems. This was later expanded to 150 schemas resulting in 285 problems.\n\nA Winograd schema is a pair of sentences that differ in only one or two words and that contain an ambiguity that is\nresolved in opposite ways in the two sentences and requires the use of world knowledge and reasoning for its\nresolution. The schema takes its name from a well-known example by Terry Winograd:\n\n> The city councilmen refused the demonstrators a permit because they [feared/advocated] violence.\n\nIf the word is \"feared\", then \"they\" presumably refers to the city council; if it is \"advocated\" then \"they\"\npresumably refers to the demonstrators.",
"## Dataset Structure",
"### Data Instances\n\nEach instance contains a text passage with a designated pronoun and two possible answers indicating which entity in\nthe passage the pronoun represents. An example instance looks like the following:",
"### Data Fields\n\n- 'text' (str): The text sequence\n- 'options' (list[str]): The two entity options that the pronoun may be referring to\n- 'label' (int): The index of the correct option in the 'options' field\n- 'pronoun' (str): The pronoun in the sequence to be resolved\n- 'pronoun_loc' (int): The starting position of the pronoun in the sequence\n- 'quote' (str): The substr with the key action or context surrounding the pronoun\n- 'quote_loc' (int): The starting position of the quote in the sequence\n- 'source' (str): A description of the source who contributed the example",
"### Licensing Information\n\nThis work is licensed under a Creative Commons Attribution 4.0 International\nLicense.\n\n\n\nThe Winograd Schema Challenge including many of the examples here was proposed by\nLevesque et al 2012:",
"### Contributions\n\nModified from loading script of: @joeddav."
] | [
"TAGS\n#license-cc-by-4.0 #region-us \n",
"# The original Winograd Schema Challenge (WSC) as hosted by Ernest Davis",
"## Dataset Description\n\n- Homepage: URL\n- Paper: URL",
"### Dataset Summary\n\nThe original Winograd Schema Challenge (WSC) consisted of 136 schemas resulting in 273 problems. This was later expanded to 150 schemas resulting in 285 problems.\n\nA Winograd schema is a pair of sentences that differ in only one or two words and that contain an ambiguity that is\nresolved in opposite ways in the two sentences and requires the use of world knowledge and reasoning for its\nresolution. The schema takes its name from a well-known example by Terry Winograd:\n\n> The city councilmen refused the demonstrators a permit because they [feared/advocated] violence.\n\nIf the word is \"feared\", then \"they\" presumably refers to the city council; if it is \"advocated\" then \"they\"\npresumably refers to the demonstrators.",
"## Dataset Structure",
"### Data Instances\n\nEach instance contains a text passage with a designated pronoun and two possible answers indicating which entity in\nthe passage the pronoun represents. An example instance looks like the following:",
"### Data Fields\n\n- 'text' (str): The text sequence\n- 'options' (list[str]): The two entity options that the pronoun may be referring to\n- 'label' (int): The index of the correct option in the 'options' field\n- 'pronoun' (str): The pronoun in the sequence to be resolved\n- 'pronoun_loc' (int): The starting position of the pronoun in the sequence\n- 'quote' (str): The substr with the key action or context surrounding the pronoun\n- 'quote_loc' (int): The starting position of the quote in the sequence\n- 'source' (str): A description of the source who contributed the example",
"### Licensing Information\n\nThis work is licensed under a Creative Commons Attribution 4.0 International\nLicense.\n\n\n\nThe Winograd Schema Challenge including many of the examples here was proposed by\nLevesque et al 2012:",
"### Contributions\n\nModified from loading script of: @joeddav."
] | [
15,
19,
12,
187,
6,
46,
163,
44,
18
] | [
"passage: TAGS\n#license-cc-by-4.0 #region-us \n# The original Winograd Schema Challenge (WSC) as hosted by Ernest Davis## Dataset Description\n\n- Homepage: URL\n- Paper: URL### Dataset Summary\n\nThe original Winograd Schema Challenge (WSC) consisted of 136 schemas resulting in 273 problems. This was later expanded to 150 schemas resulting in 285 problems.\n\nA Winograd schema is a pair of sentences that differ in only one or two words and that contain an ambiguity that is\nresolved in opposite ways in the two sentences and requires the use of world knowledge and reasoning for its\nresolution. The schema takes its name from a well-known example by Terry Winograd:\n\n> The city councilmen refused the demonstrators a permit because they [feared/advocated] violence.\n\nIf the word is \"feared\", then \"they\" presumably refers to the city council; if it is \"advocated\" then \"they\"\npresumably refers to the demonstrators.## Dataset Structure### Data Instances\n\nEach instance contains a text passage with a designated pronoun and two possible answers indicating which entity in\nthe passage the pronoun represents. An example instance looks like the following:### Data Fields\n\n- 'text' (str): The text sequence\n- 'options' (list[str]): The two entity options that the pronoun may be referring to\n- 'label' (int): The index of the correct option in the 'options' field\n- 'pronoun' (str): The pronoun in the sequence to be resolved\n- 'pronoun_loc' (int): The starting position of the pronoun in the sequence\n- 'quote' (str): The substr with the key action or context surrounding the pronoun\n- 'quote_loc' (int): The starting position of the quote in the sequence\n- 'source' (str): A description of the source who contributed the example### Licensing Information\n\nThis work is licensed under a Creative Commons Attribution 4.0 International\nLicense.\n\n\n\nThe Winograd Schema Challenge including many of the examples here was proposed by\nLevesque et al 2012:"
] |
3b1e210a9267b0ec6f37019fe831e44e1294ea84 |
# The Modified Winograd Schema Challenge (MWSC)
## Dataset Description
- **Homepage:** [http://decanlp.com](http://decanlp.com)
- **Repository:** https://github.com/salesforce/decaNLP
- **Paper:** [The Natural Language Decathlon: Multitask Learning as Question Answering](https://arxiv.org/abs/1806.08730)
- **Point of Contact:** [Bryan McCann](mailto:[email protected]), [Nitish Shirish Keskar](mailto:[email protected])
- **Size of downloaded dataset files:** 19.20 kB
- **Size of the generated dataset:** 39.35 kB
- **Total amount of disk used:** 58.55 kB
### Dataset Summary
Examples taken from the Winograd Schema Challenge modified to ensure that answers are a single word from the context.
This Modified Winograd Schema Challenge (MWSC) ensures that scores are neither inflated nor deflated by oddities in phrasing.
## Dataset Structure
### Data Instances
#### default
- **Size of downloaded dataset files:** 0.02 MB
- **Size of the generated dataset:** 0.04 MB
- **Total amount of disk used:** 0.06 MB
An example looks as follows:
```
{
"sentence": "The city councilmen refused the demonstrators a permit because they feared violence.",
"question": "Who feared violence?",
"options": [ "councilmen", "demonstrators" ],
"answer": "councilmen"
}
```
### Data Fields
The data fields are the same among all splits.
#### default
- `sentence`: a `string` feature.
- `question`: a `string` feature.
- `options`: a `list` of `string` features.
- `answer`: a `string` feature.
### Data Splits
| name |train|validation|test|
|-------|----:|---------:|---:|
|default| 80| 82| 100|
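The splits above can be inspected directly with the `datasets` library. The snippet below is a minimal sketch; it assumes the mirror published under the repo id shown at the end of this card (`coref-data/mwsc_raw`) and its default configuration.
```python
from datasets import load_dataset

# Load the default configuration (repo id assumed from this card).
mwsc = load_dataset("coref-data/mwsc_raw")

# The three splits reported in the table above.
print({split: mwsc[split].num_rows for split in mwsc})

# Inspect one example: sentence, question, options and answer fields.
example = mwsc["train"][0]
print(example["sentence"])
print(example["question"], example["options"], example["answer"])
```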
### Licensing Information
Our code for running decaNLP has been open sourced under BSD-3-Clause.
We chose to restrict decaNLP to datasets that were free and publicly accessible for research, but you should check their individual terms if you deviate from this use case.
From the [Winograd Schema Challenge](https://cs.nyu.edu/~davise/papers/WinogradSchemas/WS.html):
> Both versions of the collections are licenced under a [Creative Commons Attribution 4.0 International License](http://creativecommons.org/licenses/by/4.0/).
### Citation Information
If you use this in your work, please cite:
```
@inproceedings{10.5555/3031843.3031909,
author = {Levesque, Hector J. and Davis, Ernest and Morgenstern, Leora},
title = {The Winograd Schema Challenge},
year = {2012},
isbn = {9781577355601},
publisher = {AAAI Press},
abstract = {In this paper, we present an alternative to the Turing Test that has some conceptual and practical advantages. A Winograd schema is a pair of sentences that differ only in one or two words and that contain a referential ambiguity that is resolved in opposite directions in the two sentences. We have compiled a collection of Winograd schemas, designed so that the correct answer is obvious to the human reader, but cannot easily be found using selectional restrictions or statistical techniques over text corpora. A contestant in the Winograd Schema Challenge is presented with a collection of one sentence from each pair, and required to achieve human-level accuracy in choosing the correct disambiguation.},
booktitle = {Proceedings of the Thirteenth International Conference on Principles of Knowledge Representation and Reasoning},
pages = {552–561},
numpages = {10},
location = {Rome, Italy},
series = {KR'12}
}
@article{McCann2018decaNLP,
title={The Natural Language Decathlon: Multitask Learning as Question Answering},
author={Bryan McCann and Nitish Shirish Keskar and Caiming Xiong and Richard Socher},
journal={arXiv preprint arXiv:1806.08730},
year={2018}
}
```
### Contributions
Thanks to [@thomwolf](https://github.com/thomwolf), [@lewtun](https://github.com/lewtun), [@ghomasHudson](https://github.com/ghomasHudson), [@lhoestq](https://github.com/lhoestq) for adding this dataset. | coref-data/mwsc_raw | [
"license:cc-by-4.0",
"arxiv:1806.08730",
"region:us"
] | 2024-01-05T16:14:07+00:00 | {"license": "cc-by-4.0"} | 2024-01-19T00:03:39+00:00 | [
"1806.08730"
] | [] | TAGS
#license-cc-by-4.0 #arxiv-1806.08730 #region-us
| The Modified Winograd Schema Challenge (MWSC)
=============================================
Dataset Description
-------------------
* Homepage: URL
* Repository: URL
* Paper: The Natural Language Decathlon: Multitask Learning as Question Answering
* Point of Contact: Bryan McCann, Nitish Shirish Keskar
* Size of downloaded dataset files: 19.20 kB
* Size of the generated dataset: 39.35 kB
* Total amount of disk used: 58.55 kB
### Dataset Summary
Examples taken from the Winograd Schema Challenge modified to ensure that answers are a single word from the context.
This Modified Winograd Schema Challenge (MWSC) ensures that scores are neither inflated nor deflated by oddities in phrasing.
Dataset Structure
-----------------
### Data Instances
#### default
* Size of downloaded dataset files: 0.02 MB
* Size of the generated dataset: 0.04 MB
* Total amount of disk used: 0.06 MB
An example looks as follows:
### Data Fields
The data fields are the same among all splits.
#### default
* 'sentence': a 'string' feature.
* 'question': a 'string' feature.
* 'options': a 'list' of 'string' features.
* 'answer': a 'string' feature.
### Data Splits
### Licensing Information
Our code for running decaNLP has been open sourced under BSD-3-Clause.
We chose to restrict decaNLP to datasets that were free and publicly accessible for research, but you should check their individual terms if you deviate from this use case.
From the Winograd Schema Challenge:
>
> Both versions of the collections are licenced under a Creative Commons Attribution 4.0 International License.
>
>
>
If you use this in your work, please cite:
### Contributions
Thanks to @thomwolf, @lewtun, @ghomasHudson, @lhoestq for adding this dataset.
| [
"### Dataset Summary\n\n\nExamples taken from the Winograd Schema Challenge modified to ensure that answers are a single word from the context.\nThis Modified Winograd Schema Challenge (MWSC) ensures that scores are neither inflated nor deflated by oddities in phrasing.\n\n\nDataset Structure\n-----------------",
"### Data Instances",
"#### default\n\n\n* Size of downloaded dataset files: 0.02 MB\n* Size of the generated dataset: 0.04 MB\n* Total amount of disk used: 0.06 MB\n\n\nAn example looks as follows:",
"### Data Fields\n\n\nThe data fields are the same among all splits.",
"#### default\n\n\n* 'sentence': a 'string' feature.\n* 'question': a 'string' feature.\n* 'options': a 'list' of 'string' features.\n* 'answer': a 'string' feature.",
"### Data Splits",
"### Licensing Information\n\n\nOur code for running decaNLP has been open sourced under BSD-3-Clause.\n\n\nWe chose to restrict decaNLP to datasets that were free and publicly accessible for research, but you should check their individual terms if you deviate from this use case.\n\n\nFrom the Winograd Schema Challenge:\n\n\n\n> \n> Both versions of the collections are licenced under a Creative Commons Attribution 4.0 International License.\n> \n> \n> \n\n\nIf you use this in your work, please cite:",
"### Contributions\n\n\nThanks to @thomwolf, @lewtun, @ghomasHudson, @lhoestq for adding this dataset."
] | [
"TAGS\n#license-cc-by-4.0 #arxiv-1806.08730 #region-us \n",
"### Dataset Summary\n\n\nExamples taken from the Winograd Schema Challenge modified to ensure that answers are a single word from the context.\nThis Modified Winograd Schema Challenge (MWSC) ensures that scores are neither inflated nor deflated by oddities in phrasing.\n\n\nDataset Structure\n-----------------",
"### Data Instances",
"#### default\n\n\n* Size of downloaded dataset files: 0.02 MB\n* Size of the generated dataset: 0.04 MB\n* Total amount of disk used: 0.06 MB\n\n\nAn example looks as follows:",
"### Data Fields\n\n\nThe data fields are the same among all splits.",
"#### default\n\n\n* 'sentence': a 'string' feature.\n* 'question': a 'string' feature.\n* 'options': a 'list' of 'string' features.\n* 'answer': a 'string' feature.",
"### Data Splits",
"### Licensing Information\n\n\nOur code for running decaNLP has been open sourced under BSD-3-Clause.\n\n\nWe chose to restrict decaNLP to datasets that were free and publicly accessible for research, but you should check their individual terms if you deviate from this use case.\n\n\nFrom the Winograd Schema Challenge:\n\n\n\n> \n> Both versions of the collections are licenced under a Creative Commons Attribution 4.0 International License.\n> \n> \n> \n\n\nIf you use this in your work, please cite:",
"### Contributions\n\n\nThanks to @thomwolf, @lewtun, @ghomasHudson, @lhoestq for adding this dataset."
] | [
24,
74,
6,
44,
17,
55,
5,
110,
34
] | [
"passage: TAGS\n#license-cc-by-4.0 #arxiv-1806.08730 #region-us \n### Dataset Summary\n\n\nExamples taken from the Winograd Schema Challenge modified to ensure that answers are a single word from the context.\nThis Modified Winograd Schema Challenge (MWSC) ensures that scores are neither inflated nor deflated by oddities in phrasing.\n\n\nDataset Structure\n-----------------### Data Instances#### default\n\n\n* Size of downloaded dataset files: 0.02 MB\n* Size of the generated dataset: 0.04 MB\n* Total amount of disk used: 0.06 MB\n\n\nAn example looks as follows:### Data Fields\n\n\nThe data fields are the same among all splits.#### default\n\n\n* 'sentence': a 'string' feature.\n* 'question': a 'string' feature.\n* 'options': a 'list' of 'string' features.\n* 'answer': a 'string' feature.### Data Splits### Licensing Information\n\n\nOur code for running decaNLP has been open sourced under BSD-3-Clause.\n\n\nWe chose to restrict decaNLP to datasets that were free and publicly accessible for research, but you should check their individual terms if you deviate from this use case.\n\n\nFrom the Winograd Schema Challenge:\n\n\n\n> \n> Both versions of the collections are licenced under a Creative Commons Attribution 4.0 International License.\n> \n> \n> \n\n\nIf you use this in your work, please cite:### Contributions\n\n\nThanks to @thomwolf, @lewtun, @ghomasHudson, @lhoestq for adding this dataset."
] |
d41604b6f3c0a456a05d294485bb50fac43c7d6c | # Dolly 15k Curated
## Dataset Details
### Dataset Description
A filtered and curated version of https://huggingface.co/datasets/databricks/databricks-dolly-15k, saved in HF Chat format. The result is a high-quality dataset for SFT.
- **Created by:** [dctanner](https://huggingface.co/dctanner) and the team at [Sablo AI](https://sablo.ai)
- **License:** CC BY-SA 3.0
## Dataset Structure
We structure the dataset using the format commonly used as input into [Hugging Face Chat Templates](https://huggingface.co/docs/transformers/chat_templating). Where present, the context field text has been appended to the instruction in the OpenAI-style `Text: """..."""` format.
```
[
{"role": "user", "content": "Hello, how are you?"},
{"role": "assistant", "content": "I'm doing great. How can I help you today?"}
]
```
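Because each record already stores the conversation as a `messages` list in this format, it can be passed straight to a tokenizer chat template when preparing SFT inputs. The sketch below is illustrative only: the checkpoint name is an arbitrary assumption, and any chat-capable tokenizer would work the same way.
```python
from transformers import AutoTokenizer

# Any tokenizer that ships a chat template works here; this checkpoint is just an assumed example.
tokenizer = AutoTokenizer.from_pretrained("HuggingFaceH4/zephyr-7b-beta")

messages = [
    {"role": "user", "content": "Hello, how are you?"},
    {"role": "assistant", "content": "I'm doing great. How can I help you today?"},
]

# Render the conversation with the tokenizer's own chat template (no tokenization yet).
text = tokenizer.apply_chat_template(messages, tokenize=False)
print(text)
```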
## Dataset Creation
### Source Data
- **Source Dataset:** https://huggingface.co/datasets/argilla/databricks-dolly-15k-curated-multilingual and https://huggingface.co/datasets/databricks/databricks-dolly-15k
#### Data Collection and Processing
We started with https://huggingface.co/datasets/argilla/databricks-dolly-15k-curated-multilingual (en split only) which is a manually curated version of https://huggingface.co/datasets/databricks/databricks-dolly-15k.
In addition to reformatting the data to fit the HF Chat style, we removed many duplicates based on the instruction text. This ensures the dataset is diverse and not repetitive.
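A minimal sketch of the kind of instruction-level deduplication described above, assuming the instruction is the content of the first user message in each record; this illustrates the approach rather than reproducing the exact script used during curation.
```python
from datasets import load_dataset

dataset = load_dataset("sablo/dolly_curated", split="train")

seen = set()

def keep_first_occurrence(example):
    # Use the first user message (the instruction) as the deduplication key.
    instruction = example["messages"][0]["content"]
    if instruction in seen:
        return False
    seen.add(instruction)
    return True

# Keep only the first occurrence of each unique instruction.
deduplicated = dataset.filter(keep_first_occurrence)
print(len(dataset), "->", len(deduplicated))
```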
# License
- **License:** CC BY-SA 3.0
This dataset is usable for commercial purposes. Certain categories of material in the dataset include materials from the following sources, licensed under the CC BY-SA 3.0 license:
- Wikipedia (various pages) - https://www.wikipedia.org/ - Copyright © Wikipedia editors and contributors.
- Databricks (https://www.databricks.com) - Copyright © Databricks
# Contact
Created by [dctanner](https://huggingface.co/dctanner) and the team at [Sablo AI](https://sablo.ai) | sablo/dolly_curated | [
"region:us"
] | 2024-01-05T16:26:36+00:00 | {"dataset_info": {"features": [{"name": "messages", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "category", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 11515591, "num_examples": 13952}, {"name": "test", "num_bytes": 573809, "num_examples": 735}], "download_size": 7032039, "dataset_size": 12089400}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-11T10:38:41+00:00 | [] | [] | TAGS
#region-us
| # Dolly 15k Curated
## Dataset Details
### Dataset Description
A filtered and curated dataset version of URL Saved in HF Chat format. The result is a high quality dataset for SFT.
- Created by: dctanner and the team at Sablo AI
- License: CC BY-SA 3.0
## Dataset Structure
We structure the dataset using the format commonly used as input into Hugging Face Chat Templates. Where present, the context field text has been appended to the instruction in OpenAI style 'Text: """..."""' format.
## Dataset Creation
### Source Data
- Source Dataset: URL and URL
#### Data Collection and Processing
We started with URL (en split only) which is a manually curated version of URL
As well as formatting to fit HF Chat style, we removed many duplicates based on the instruction text. This ensures the dataset is diverse and not repetitive.
# License
- License: CC BY-SA 3.0
This dataset is usable for commercial purposes. Certain categories of material in the dataset include materials from the following sources, licensed under the CC BY-SA 3.0 license:
- Wikipedia (various pages) - URL - Copyright © Wikipedia editors and contributors.
- Databricks (URL) - Copyright © Databricks
# Contact
Created by dctanner and the team at Sablo AI | [
"# Dolly 15k Curated",
"## Dataset Details",
"### Dataset Description\n\nA filtered and curated dataset version of URL Saved in HF Chat format. The result is a high quality dataset for SFT.\n\n- Created by: dctanner and the team at Sablo AI\n- License: CC BY-SA 3.0",
"## Dataset Structure\n\nWe structure the dataset using the format commonly used as input into Hugging Face Chat Templates. Where present, the context field text has been appending to the instruction in OpenAI style 'Text: \"\"\"...\"\"\"' format.",
"## Dataset Creation",
"### Source Data\n\n- Source Dataset: URL and URL",
"#### Data Collection and Processing\n\nWe started with URL (en split only) which is a manually curated version of URL\n\nAs well as formatting to fit HF Chat style, we removed many duplicates based on the instruction text. This ensures the dataset is diverse and not repetitive.",
"# License\n\n- License: CC BY-SA 3.0\n\nThis dataset is usable for commercial purposes. Certain categories of material in the dataset include materials from the following sources, licensed under the CC BY-SA 3.0 license:\n- Wikipedia (various pages) - URL - Copyright © Wikipedia editors and contributors.\n- Databricks (URL) - Copyright © Databricks",
"# Contact\n\nCreated by dctanner and the team at Sablo AI"
] | [
"TAGS\n#region-us \n",
"# Dolly 15k Curated",
"## Dataset Details",
"### Dataset Description\n\nA filtered and curated dataset version of URL Saved in HF Chat format. The result is a high quality dataset for SFT.\n\n- Created by: dctanner and the team at Sablo AI\n- License: CC BY-SA 3.0",
"## Dataset Structure\n\nWe structure the dataset using the format commonly used as input into Hugging Face Chat Templates. Where present, the context field text has been appending to the instruction in OpenAI style 'Text: \"\"\"...\"\"\"' format.",
"## Dataset Creation",
"### Source Data\n\n- Source Dataset: URL and URL",
"#### Data Collection and Processing\n\nWe started with URL (en split only) which is a manually curated version of URL\n\nAs well as formatting to fit HF Chat style, we removed many duplicates based on the instruction text. This ensures the dataset is diverse and not repetitive.",
"# License\n\n- License: CC BY-SA 3.0\n\nThis dataset is usable for commercial purposes. Certain categories of material in the dataset include materials from the following sources, licensed under the CC BY-SA 3.0 license:\n- Wikipedia (various pages) - URL - Copyright © Wikipedia editors and contributors.\n- Databricks (URL) - Copyright © Databricks",
"# Contact\n\nCreated by dctanner and the team at Sablo AI"
] | [
6,
7,
4,
59,
58,
5,
12,
63,
81,
15
] | [
"passage: TAGS\n#region-us \n# Dolly 15k Curated## Dataset Details### Dataset Description\n\nA filtered and curated dataset version of URL Saved in HF Chat format. The result is a high quality dataset for SFT.\n\n- Created by: dctanner and the team at Sablo AI\n- License: CC BY-SA 3.0## Dataset Structure\n\nWe structure the dataset using the format commonly used as input into Hugging Face Chat Templates. Where present, the context field text has been appending to the instruction in OpenAI style 'Text: \"\"\"...\"\"\"' format.## Dataset Creation### Source Data\n\n- Source Dataset: URL and URL#### Data Collection and Processing\n\nWe started with URL (en split only) which is a manually curated version of URL\n\nAs well as formatting to fit HF Chat style, we removed many duplicates based on the instruction text. This ensures the dataset is diverse and not repetitive.# License\n\n- License: CC BY-SA 3.0\n\nThis dataset is usable for commercial purposes. Certain categories of material in the dataset include materials from the following sources, licensed under the CC BY-SA 3.0 license:\n- Wikipedia (various pages) - URL - Copyright © Wikipedia editors and contributors.\n- Databricks (URL) - Copyright © Databricks# Contact\n\nCreated by dctanner and the team at Sablo AI"
] |
bca444559795e1b934118b98294124e5e6b05add | # Dataset Card for "nft_prediction_1_with_dates"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | hongerzh/nft_prediction_1_with_dates | [
"region:us"
] | 2024-01-05T16:29:34+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "text", "dtype": "string"}, {"name": "label", "dtype": "float64"}, {"name": "time", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 802026316.85, "num_examples": 3495}, {"name": "validation", "num_bytes": 89427799.0, "num_examples": 500}, {"name": "test", "num_bytes": 182630437.0, "num_examples": 997}], "download_size": 856594702, "dataset_size": 1074084552.85}} | 2024-01-05T16:34:37+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "nft_prediction_1_with_dates"
More Information needed | [
"# Dataset Card for \"nft_prediction_1_with_dates\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"nft_prediction_1_with_dates\"\n\nMore Information needed"
] | [
6,
22
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"nft_prediction_1_with_dates\"\n\nMore Information needed"
] |
e54a64ddd8de8a2d8e59ae7d01d3b007b9b78280 |
# Dataset Card for Evaluation run of jondurbin/cinematika-7b-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jondurbin/cinematika-7b-v0.1](https://huggingface.co/jondurbin/cinematika-7b-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__cinematika-7b-v0.1",
"harness_winogrande_5",
split="train")
```
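Individual task details can be loaded the same way by naming their configuration; the snippet below is a sketch using the GSM8K configuration listed for this run and the `latest` split alias mentioned above.
```python
from datasets import load_dataset

# Per-task details; "latest" always points at the most recent evaluation run.
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_jondurbin__cinematika-7b-v0.1",
    "harness_gsm8k_5",
    split="latest",
)
print(gsm8k_details[0])
```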
## Latest results
These are the [latest results from run 2024-01-05T16:28:44.189724](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__cinematika-7b-v0.1/blob/main/results_2024-01-05T16-28-44.189724.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5197396980815042,
"acc_stderr": 0.034372161413757284,
"acc_norm": 0.5250805424692492,
"acc_norm_stderr": 0.03514199991343521,
"mc1": 0.30599755201958384,
"mc1_stderr": 0.016132229728155034,
"mc2": 0.44466808310122014,
"mc2_stderr": 0.015193211017572112
},
"harness|arc:challenge|25": {
"acc": 0.5708191126279863,
"acc_stderr": 0.014464085894870653,
"acc_norm": 0.5998293515358362,
"acc_norm_stderr": 0.014317197787809174
},
"harness|hellaswag|10": {
"acc": 0.6138219478191596,
"acc_stderr": 0.004858771963468873,
"acc_norm": 0.8113921529575782,
"acc_norm_stderr": 0.003903972923680323
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.04284958639753399,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.04284958639753399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.040633027314866704,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.040633027314866704
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03942082639927213,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03942082639927213
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4682080924855491,
"acc_stderr": 0.03804749744364763,
"acc_norm": 0.4682080924855491,
"acc_norm_stderr": 0.03804749744364763
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.048108401480826346,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.048108401480826346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.046570472605949625,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.046570472605949625
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.024594975128920945,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.024594975128920945
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795133,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795133
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.47096774193548385,
"acc_stderr": 0.028396016402761005,
"acc_norm": 0.47096774193548385,
"acc_norm_stderr": 0.028396016402761005
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4433497536945813,
"acc_stderr": 0.03495334582162934,
"acc_norm": 0.4433497536945813,
"acc_norm_stderr": 0.03495334582162934
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.703030303030303,
"acc_stderr": 0.03567969772268049,
"acc_norm": 0.703030303030303,
"acc_norm_stderr": 0.03567969772268049
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.03427308652999934,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.03427308652999934
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7927461139896373,
"acc_stderr": 0.029252823291803627,
"acc_norm": 0.7927461139896373,
"acc_norm_stderr": 0.029252823291803627
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5358974358974359,
"acc_stderr": 0.025285585990017848,
"acc_norm": 0.5358974358974359,
"acc_norm_stderr": 0.025285585990017848
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145665,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145665
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5798319327731093,
"acc_stderr": 0.03206183783236152,
"acc_norm": 0.5798319327731093,
"acc_norm_stderr": 0.03206183783236152
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6,
"acc_stderr": 0.02100420126042007,
"acc_norm": 0.6,
"acc_norm_stderr": 0.02100420126042007
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.033247089118091176,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.033247089118091176
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.0327028718148208,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.0327028718148208
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7468354430379747,
"acc_stderr": 0.028304657943035303,
"acc_norm": 0.7468354430379747,
"acc_norm_stderr": 0.028304657943035303
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.600896860986547,
"acc_stderr": 0.03286745312567961,
"acc_norm": 0.600896860986547,
"acc_norm_stderr": 0.03286745312567961
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.549618320610687,
"acc_stderr": 0.04363643698524779,
"acc_norm": 0.549618320610687,
"acc_norm_stderr": 0.04363643698524779
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04643454608906275,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04643454608906275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.558282208588957,
"acc_stderr": 0.03901591825836184,
"acc_norm": 0.558282208588957,
"acc_norm_stderr": 0.03901591825836184
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.6213592233009708,
"acc_stderr": 0.048026946982589726,
"acc_norm": 0.6213592233009708,
"acc_norm_stderr": 0.048026946982589726
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6452991452991453,
"acc_stderr": 0.03134250486245402,
"acc_norm": 0.6452991452991453,
"acc_norm_stderr": 0.03134250486245402
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6704980842911877,
"acc_stderr": 0.016808322261740467,
"acc_norm": 0.6704980842911877,
"acc_norm_stderr": 0.016808322261740467
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5809248554913294,
"acc_stderr": 0.026564178111422632,
"acc_norm": 0.5809248554913294,
"acc_norm_stderr": 0.026564178111422632
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.29497206703910617,
"acc_stderr": 0.01525193157920819,
"acc_norm": 0.29497206703910617,
"acc_norm_stderr": 0.01525193157920819
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5522875816993464,
"acc_stderr": 0.02847293847803353,
"acc_norm": 0.5522875816993464,
"acc_norm_stderr": 0.02847293847803353
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6270096463022508,
"acc_stderr": 0.027466610213140112,
"acc_norm": 0.6270096463022508,
"acc_norm_stderr": 0.027466610213140112
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.02743162372241501,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.02743162372241501
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.38652482269503546,
"acc_stderr": 0.029049190342543454,
"acc_norm": 0.38652482269503546,
"acc_norm_stderr": 0.029049190342543454
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42633637548891784,
"acc_stderr": 0.01263088477159969,
"acc_norm": 0.42633637548891784,
"acc_norm_stderr": 0.01263088477159969
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5551470588235294,
"acc_stderr": 0.030187532060329383,
"acc_norm": 0.5551470588235294,
"acc_norm_stderr": 0.030187532060329383
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5424836601307189,
"acc_stderr": 0.020154685712590895,
"acc_norm": 0.5424836601307189,
"acc_norm_stderr": 0.020154685712590895
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4727272727272727,
"acc_stderr": 0.04782001791380063,
"acc_norm": 0.4727272727272727,
"acc_norm_stderr": 0.04782001791380063
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6653061224489796,
"acc_stderr": 0.030209235226242307,
"acc_norm": 0.6653061224489796,
"acc_norm_stderr": 0.030209235226242307
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5522388059701493,
"acc_stderr": 0.035161847729521654,
"acc_norm": 0.5522388059701493,
"acc_norm_stderr": 0.035161847729521654
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6198830409356725,
"acc_stderr": 0.03722965741385539,
"acc_norm": 0.6198830409356725,
"acc_norm_stderr": 0.03722965741385539
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30599755201958384,
"mc1_stderr": 0.016132229728155034,
"mc2": 0.44466808310122014,
"mc2_stderr": 0.015193211017572112
},
"harness|winogrande|5": {
"acc": 0.7750591949486977,
"acc_stderr": 0.011735043564126737
},
"harness|gsm8k|5": {
"acc": 0.17589082638362397,
"acc_stderr": 0.010487120635539625
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_jondurbin__cinematika-7b-v0.1 | [
"region:us"
] | 2024-01-05T16:31:02+00:00 | {"pretty_name": "Evaluation run of jondurbin/cinematika-7b-v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [jondurbin/cinematika-7b-v0.1](https://huggingface.co/jondurbin/cinematika-7b-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__cinematika-7b-v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-05T16:28:44.189724](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__cinematika-7b-v0.1/blob/main/results_2024-01-05T16-28-44.189724.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5197396980815042,\n \"acc_stderr\": 0.034372161413757284,\n \"acc_norm\": 0.5250805424692492,\n \"acc_norm_stderr\": 0.03514199991343521,\n \"mc1\": 0.30599755201958384,\n \"mc1_stderr\": 0.016132229728155034,\n \"mc2\": 0.44466808310122014,\n \"mc2_stderr\": 0.015193211017572112\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5708191126279863,\n \"acc_stderr\": 0.014464085894870653,\n \"acc_norm\": 0.5998293515358362,\n \"acc_norm_stderr\": 0.014317197787809174\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6138219478191596,\n \"acc_stderr\": 0.004858771963468873,\n \"acc_norm\": 0.8113921529575782,\n \"acc_norm_stderr\": 0.003903972923680323\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.040633027314866704,\n \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.040633027314866704\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03942082639927213\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4682080924855491,\n \"acc_stderr\": 0.03804749744364763,\n \"acc_norm\": 0.4682080924855491,\n \"acc_norm_stderr\": 0.03804749744364763\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.048108401480826346,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.048108401480826346\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.44680851063829785,\n \"acc_stderr\": 0.0325005368436584,\n \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.0325005368436584\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n \"acc_stderr\": 0.046570472605949625,\n \"acc_norm\": 0.4298245614035088,\n \"acc_norm_stderr\": 0.046570472605949625\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.024594975128920945,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.024594975128920945\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n \"acc_stderr\": 0.04263906892795133,\n \"acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.04263906892795133\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.47096774193548385,\n \"acc_stderr\": 0.028396016402761005,\n \"acc_norm\": 0.47096774193548385,\n \"acc_norm_stderr\": 0.028396016402761005\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4433497536945813,\n \"acc_stderr\": 0.03495334582162934,\n \"acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.03495334582162934\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.703030303030303,\n \"acc_stderr\": 0.03567969772268049,\n \"acc_norm\": 0.703030303030303,\n \"acc_norm_stderr\": 0.03567969772268049\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.03427308652999934,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.03427308652999934\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7927461139896373,\n \"acc_stderr\": 0.029252823291803627,\n \"acc_norm\": 0.7927461139896373,\n 
\"acc_norm_stderr\": 0.029252823291803627\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5358974358974359,\n \"acc_stderr\": 0.025285585990017848,\n \"acc_norm\": 0.5358974358974359,\n \"acc_norm_stderr\": 0.025285585990017848\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145665,\n \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145665\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5798319327731093,\n \"acc_stderr\": 0.03206183783236152,\n \"acc_norm\": 0.5798319327731093,\n \"acc_norm_stderr\": 0.03206183783236152\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.02100420126042007,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.02100420126042007\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.033247089118091176,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.033247089118091176\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6813725490196079,\n \"acc_stderr\": 0.0327028718148208,\n \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.0327028718148208\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7468354430379747,\n \"acc_stderr\": 0.028304657943035303,\n \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.028304657943035303\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.600896860986547,\n \"acc_stderr\": 0.03286745312567961,\n \"acc_norm\": 0.600896860986547,\n \"acc_norm_stderr\": 0.03286745312567961\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.549618320610687,\n \"acc_stderr\": 0.04363643698524779,\n \"acc_norm\": 0.549618320610687,\n \"acc_norm_stderr\": 0.04363643698524779\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.71900826446281,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\": 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.04643454608906275,\n \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.04643454608906275\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.558282208588957,\n \"acc_stderr\": 0.03901591825836184,\n \"acc_norm\": 0.558282208588957,\n \"acc_norm_stderr\": 0.03901591825836184\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6213592233009708,\n \"acc_stderr\": 0.048026946982589726,\n \"acc_norm\": 0.6213592233009708,\n \"acc_norm_stderr\": 0.048026946982589726\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6452991452991453,\n \"acc_stderr\": 0.03134250486245402,\n \"acc_norm\": 0.6452991452991453,\n \"acc_norm_stderr\": 0.03134250486245402\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.6704980842911877,\n \"acc_stderr\": 0.016808322261740467,\n \"acc_norm\": 0.6704980842911877,\n \"acc_norm_stderr\": 0.016808322261740467\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5809248554913294,\n \"acc_stderr\": 0.026564178111422632,\n \"acc_norm\": 0.5809248554913294,\n \"acc_norm_stderr\": 0.026564178111422632\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.29497206703910617,\n \"acc_stderr\": 0.01525193157920819,\n \"acc_norm\": 0.29497206703910617,\n \"acc_norm_stderr\": 0.01525193157920819\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5522875816993464,\n \"acc_stderr\": 0.02847293847803353,\n \"acc_norm\": 0.5522875816993464,\n \"acc_norm_stderr\": 0.02847293847803353\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6270096463022508,\n \"acc_stderr\": 0.027466610213140112,\n \"acc_norm\": 0.6270096463022508,\n \"acc_norm_stderr\": 0.027466610213140112\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5833333333333334,\n \"acc_stderr\": 0.02743162372241501,\n \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.02743162372241501\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.38652482269503546,\n \"acc_stderr\": 0.029049190342543454,\n \"acc_norm\": 0.38652482269503546,\n \"acc_norm_stderr\": 0.029049190342543454\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42633637548891784,\n \"acc_stderr\": 0.01263088477159969,\n \"acc_norm\": 0.42633637548891784,\n \"acc_norm_stderr\": 0.01263088477159969\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5551470588235294,\n \"acc_stderr\": 0.030187532060329383,\n \"acc_norm\": 0.5551470588235294,\n \"acc_norm_stderr\": 0.030187532060329383\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5424836601307189,\n \"acc_stderr\": 0.020154685712590895,\n \"acc_norm\": 0.5424836601307189,\n \"acc_norm_stderr\": 0.020154685712590895\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4727272727272727,\n \"acc_stderr\": 0.04782001791380063,\n \"acc_norm\": 0.4727272727272727,\n \"acc_norm_stderr\": 0.04782001791380063\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6653061224489796,\n \"acc_stderr\": 0.030209235226242307,\n \"acc_norm\": 0.6653061224489796,\n \"acc_norm_stderr\": 0.030209235226242307\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5522388059701493,\n \"acc_stderr\": 0.035161847729521654,\n \"acc_norm\": 0.5522388059701493,\n \"acc_norm_stderr\": 0.035161847729521654\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.4578313253012048,\n \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6198830409356725,\n \"acc_stderr\": 0.03722965741385539,\n \"acc_norm\": 0.6198830409356725,\n \"acc_norm_stderr\": 0.03722965741385539\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30599755201958384,\n \"mc1_stderr\": 0.016132229728155034,\n \"mc2\": 0.44466808310122014,\n \"mc2_stderr\": 0.015193211017572112\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7750591949486977,\n \"acc_stderr\": 0.011735043564126737\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.17589082638362397,\n \"acc_stderr\": 
0.010487120635539625\n }\n}\n```", "repo_url": "https://huggingface.co/jondurbin/cinematika-7b-v0.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|arc:challenge|25_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|gsm8k|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hellaswag|10_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T16-28-44.189724.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T16-28-44.189724.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T16-28-44.189724.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T16-28-44.189724.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T16-28-44.189724.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_05T16_28_44.189724", "path": ["**/details_harness|winogrande|5_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-05T16-28-44.189724.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_05T16_28_44.189724", "path": ["results_2024-01-05T16-28-44.189724.parquet"]}, {"split": "latest", "path": ["results_2024-01-05T16-28-44.189724.parquet"]}]}]} | 2024-01-05T16:31:24+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of jondurbin/cinematika-7b-v0.1
Dataset automatically created during the evaluation run of model jondurbin/cinematika-7b-v0.1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
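For example, a minimal loading sketch (the repo id below is an assumption, inferred from the standard `open-llm-leaderboard/details_<org>__<model>` naming used for these evaluation runs; the config and split names are only examples):

```python
from datasets import load_dataset

# Repo id assumed from the usual details naming convention for these runs.
data = load_dataset("open-llm-leaderboard/details_jondurbin__cinematika-7b-v0.1",
                    "harness_winogrande_5",
                    split="train")
```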
## Latest results
These are the latest results from run 2024-01-05T16:28:44.189724 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of jondurbin/cinematika-7b-v0.1\n\n\n\nDataset automatically created during the evaluation run of model jondurbin/cinematika-7b-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T16:28:44.189724(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jondurbin/cinematika-7b-v0.1\n\n\n\nDataset automatically created during the evaluation run of model jondurbin/cinematika-7b-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T16:28:44.189724(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
185,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jondurbin/cinematika-7b-v0.1\n\n\n\nDataset automatically created during the evaluation run of model jondurbin/cinematika-7b-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-05T16:28:44.189724(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
baad42742cdfc7f35632fad0b4b9417e4264918f | # Dataset Card for "araproje_hellaswag_en_conf3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ibranze/araproje_hellaswag_en_conf3 | [
"region:us"
] | 2024-01-05T16:34:46+00:00 | {"dataset_info": {"features": [{"name": "ind", "dtype": "int32"}, {"name": "activity_label", "dtype": "string"}, {"name": "ctx_a", "dtype": "string"}, {"name": "ctx_b", "dtype": "string"}, {"name": "ctx", "dtype": "string"}, {"name": "endings", "sequence": "string"}, {"name": "source_id", "dtype": "string"}, {"name": "split", "dtype": "string"}, {"name": "split_type", "dtype": "string"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "validation", "num_bytes": 149738.0, "num_examples": 250}], "download_size": 80615, "dataset_size": 149738.0}, "configs": [{"config_name": "default", "data_files": [{"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-05T16:34:48+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "araproje_hellaswag_en_conf3"
More Information needed | [
"# Dataset Card for \"araproje_hellaswag_en_conf3\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"araproje_hellaswag_en_conf3\"\n\nMore Information needed"
] | [
6,
22
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"araproje_hellaswag_en_conf3\"\n\nMore Information needed"
] |
fe5702b58b7b7a1f2f2a823641fa7f0d0c09fb6f |
# BEIR/MTEB hard negatives dataset
A dataset for quick evaluation of embedding models during their training.
The problem: running a full MTEB evaluation on a single GPU may take 10-20 hours. Most of this time is spent on embedding all 30M docs in all 10+ corpora. This dataset solves this problem by unwrapping a "retrieval" style benchmark into the "reranking" style:
* We compute embeddings for all documents in the corpora with the [intfloat/e5-base-v2](todo) model.
* For each corpus in BEIR/MTEB benchmark we build a Lucene index with text documents and their embeddings.
* For each eval query we do a hybrid [RRF](todo)-based retrieval for the top-32 negatives.
As the BEIR test set is size-unbalanced (TREC-COVID has 42 queries, while MS MARCO has ~4000), we sample up to 300 random queries from each dataset.
It takes around 30-60 seconds to perform eval using Nixietune on a single RTX 4090.
A dataset in a [nixietune](https://github.com/nixiesearch/nixietune) compatible format:
```json
{
"query": ")what was the immediate impact of the success of the manhattan project?",
"pos": [
"The presence of communication amid scientific minds was equally important to the success of the Manhattan Project as scientific intellect was. The only cloud hanging over the impressive achievement of the atomic researchers and engineers is what their success truly meant; hundreds of thousands of innocent lives obliterated."
],
"neg": [
"Abstract. The pivotal engineering and scientific success of the Twentieth century was the Manhattan Project. The Manhattan Project assimilated concepts and leaders from all scientific fields and engineering disciplines to construct the first two atomic bombs.",
"The pivotal engineering and scientific success of the Twentieth century was the Manhattan Project. The Manhattan Project assimilated concepts and leaders from all scientific fields and engineering disciplines to construct the first two atomic bombs."
]
}
```
## Usage
To use with HF datasets:
```bash
pip install datasets zstandard
```
```python
from datasets import load_dataset
data = load_dataset('nixiesearch/beir-eval-hard-negatives')
print(data["test"].features)
```
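As a further illustration, below is a minimal reranking-style evaluation sketch. It assumes a sentence-transformers embedding model and the `pos`/`neg` field names shown in the format example above (the published schema may use `positive`/`negative` instead); it is not the Nixietune implementation, only the general idea of scoring each query against its pooled positives and hard negatives:

```python
from datasets import load_dataset
from sentence_transformers import SentenceTransformer, util

data = load_dataset("nixiesearch/beir-eval-hard-negatives", split="test")
model = SentenceTransformer("intfloat/e5-base-v2")  # any embedding model under evaluation

hits, n = 0, 200  # small sample, for illustration only
for row in data.select(range(n)):
    docs = row["pos"] + row["neg"]  # positives first, then hard negatives
    # e5-style models expect "query: " / "passage: " prefixes
    q_emb = model.encode("query: " + row["query"], normalize_embeddings=True)
    d_emb = model.encode(["passage: " + d for d in docs], normalize_embeddings=True)
    scores = util.cos_sim(q_emb, d_emb)[0]
    hits += int(int(scores.argmax()) < len(row["pos"]))  # is the top-ranked doc a positive?

print("precision@1 on the sample:", hits / n)
```

Swapping in the checkpoint being trained gives a quick proxy metric between training steps without re-embedding the full corpora.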
## License
Apache 2.0 | nixiesearch/beir-eval-hard-negatives | [
"task_categories:sentence-similarity",
"size_categories:100K<n<1M",
"source_datasets:BeIR",
"language:en",
"license:apache-2.0",
"text",
"region:us"
] | 2024-01-05T16:41:48+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["100K<n<1M"], "source_datasets": ["BeIR"], "task_categories": ["sentence-similarity"], "pretty_name": "MTEB/BEIR eval hard negatives", "tags": ["text"], "dataset_info": {"config_name": "default", "features": [{"name": "query", "dtype": "string"}, {"name": "positive", "sequence": "string"}, {"name": "negative", "sequence": "string"}], "splits": [{"name": "test", "num_bytes": 226515502, "num_examples": 3679}]}, "configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test/*"}]}], "train-eval-index": [{"config": "default", "task": "sentence-similarity", "splits": {"eval_split": "test"}}]} | 2024-01-05T23:48:54+00:00 | [] | [
"en"
] | TAGS
#task_categories-sentence-similarity #size_categories-100K<n<1M #source_datasets-BeIR #language-English #license-apache-2.0 #text #region-us
|
# BEIR/MTEB hard negatives dataset
A dataset for quick evaluation of embedding models during their training.
The problem: running a full MTEB evaluation on a single GPU may take 10-20 hours. Most of this time is spent on embedding all 30M docs in all 10+ corpora. This dataset solves this problem by unwrapping a "retrieval" style benchmark into the "reranking" style:
* We compute embeddings for all documents in the corpora with the intfloat/e5-base-v2 model.
* For each corpus in BEIR/MTEB benchmark we build a Lucene index with text documents and their embeddings.
* For each eval query we do a hybrid RRF-based retrieval for top-32 negatives
As BEIR testset is size-unbalanced (TREC-COVID is 42 queries, and MS MARCO is ~4000) we sample top-300 random queries from each dataset.
It takes around 30-60 seconds to perform eval using Nixietune on a single RTX 4090.
A dataset in a nixietune compatible format:
## Usage
To use with HF datasets:
## License
Apache 2.0 | [
"# BEIR/MTEB hard negatives dataset\n\nA dataset for quick evaluation of embedding models during their training.\n\nThe problem: running a full MTEB evaluation on a single GPU may take 10-20 hours. Most of this time is spent on embedding all 30M docs in all 10+ corpora. This dataset solves this problem by unwrapping a \"retrieval\" style benchmark into the \"reranking\" style:\n\n* We compute embeddings for all documents in the corpora with the intfloat/e5-base-v2 model.\n* For each corpus in BEIR/MTEB benchmark we build a Lucene index with text documents and their embeddings.\n* For each eval query we do a hybrid RRF-based retrieval for top-32 negatives\n\nAs BEIR testset is size-unbalanced (TREC-COVID is 42 queries, and MS MARCO is ~4000) we sample top-300 random queries from each dataset.\n\nIt takes around 30-60 seconds to perform eval using Nixietune on a single RTX 4090.\n\nA dataset in a nixietune compatible format:",
"## Usage\n\nTo use with HF datasets:",
"## License\n\nApache 2.0"
] | [
"TAGS\n#task_categories-sentence-similarity #size_categories-100K<n<1M #source_datasets-BeIR #language-English #license-apache-2.0 #text #region-us \n",
"# BEIR/MTEB hard negatives dataset\n\nA dataset for quick evaluation of embedding models during their training.\n\nThe problem: running a full MTEB evaluation on a single GPU may take 10-20 hours. Most of this time is spent on embedding all 30M docs in all 10+ corpora. This dataset solves this problem by unwrapping a \"retrieval\" style benchmark into the \"reranking\" style:\n\n* We compute embeddings for all documents in the corpora with the intfloat/e5-base-v2 model.\n* For each corpus in BEIR/MTEB benchmark we build a Lucene index with text documents and their embeddings.\n* For each eval query we do a hybrid RRF-based retrieval for top-32 negatives\n\nAs BEIR testset is size-unbalanced (TREC-COVID is 42 queries, and MS MARCO is ~4000) we sample top-300 random queries from each dataset.\n\nIt takes around 30-60 seconds to perform eval using Nixietune on a single RTX 4090.\n\nA dataset in a nixietune compatible format:",
"## Usage\n\nTo use with HF datasets:",
"## License\n\nApache 2.0"
] | [
54,
248,
12,
5
] | [
"passage: TAGS\n#task_categories-sentence-similarity #size_categories-100K<n<1M #source_datasets-BeIR #language-English #license-apache-2.0 #text #region-us \n# BEIR/MTEB hard negatives dataset\n\nA dataset for quick evaluation of embedding models during their training.\n\nThe problem: running a full MTEB evaluation on a single GPU may take 10-20 hours. Most of this time is spent on embedding all 30M docs in all 10+ corpora. This dataset solves this problem by unwrapping a \"retrieval\" style benchmark into the \"reranking\" style:\n\n* We compute embeddings for all documents in the corpora with the intfloat/e5-base-v2 model.\n* For each corpus in BEIR/MTEB benchmark we build a Lucene index with text documents and their embeddings.\n* For each eval query we do a hybrid RRF-based retrieval for top-32 negatives\n\nAs BEIR testset is size-unbalanced (TREC-COVID is 42 queries, and MS MARCO is ~4000) we sample top-300 random queries from each dataset.\n\nIt takes around 30-60 seconds to perform eval using Nixietune on a single RTX 4090.\n\nA dataset in a nixietune compatible format:## Usage\n\nTo use with HF datasets:## License\n\nApache 2.0"
] |
e8037f84ee107d7863f4c22881d603ca4563c45f |
This dataset contains problems and solutions from LeetCode, crawled from: https://github.com/AnasImloul/Leetcode-Solutions
The format of the data:
+ title: title of the problem
+ algo_input: the description of the problem
+ solution_py: the solution in Python
+ solution_js: the solution in Js
+ solution_java: the solution in Java
+ solution_c: the solution in C | khaimaitien/leetcode_problem_solution | [
"task_categories:text-generation",
"region:us"
] | 2024-01-05T17:02:36+00:00 | {"task_categories": ["text-generation"]} | 2024-01-05T17:24:42+00:00 | [] | [] | TAGS
#task_categories-text-generation #region-us
|
This dataset contains: problems and solutions in Leetcode, crawled from: URL
The format of data:
+ title: title of the problem
+ algo_input: the description of the problem
+ solution_py: the solution in Python
+ solution_js: the solution in Js
+ solution_java: the solution in Java
+ solution_c: the solution in C | [] | [
"TAGS\n#task_categories-text-generation #region-us \n"
] | [
17
] | [
"passage: TAGS\n#task_categories-text-generation #region-us \n"
] |
4ef27a7055b7a35ac123a9c6c5512e3b32f3533a |
# Dataset of surtr (Arknights)
This is the dataset of surtr (Arknights), containing 200 images and their tags.
The core tags of this character are `horns, long_hair, red_hair, purple_eyes, breasts, bangs, very_long_hair, hair_between_eyes, large_breasts, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Download | Description |
|:-----------------|---------:|:-----------------------------------------|:----------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| pruned | 200 | [Download](dataset-pruned.zip) | Raw data with meta information, core character tags pruned. |
| pruned-stage3 | 566 | [Download](dataset-pruned-stage3.zip) | 3-stage cropped raw data with meta information, core character tags pruned. |
| stage3-800 | 566 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200      | 566 | [Download](dataset-stage3-1200.zip)      | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels.     |
| stage3-p480-1200 | 552 | [Download](dataset-stage3-p480-1200.zip) | 3-stage cropped dataset with the area not less than 480x480 pixels. |
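For instance, one way to fetch a package programmatically with `huggingface_hub` (a sketch that assumes the zip files listed above are stored at the top level of the `narugo/test_v1.5_ds` dataset repository named in this card's metadata):

```python
from huggingface_hub import hf_hub_download

# Any package name from the table above can be substituted for `filename`.
path = hf_hub_download(
    repo_id="narugo/test_v1.5_ds",
    repo_type="dataset",
    filename="dataset-raw.zip",
)
print(path)  # local path to the downloaded archive
```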
## List of Clusters
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:------------------------------------------------------------|
| 0 | 13 |  |  |  | demon_horns |
| 1 | 20 |  |  |  | demon_girl, demon_horns |
| 2 | 4 |  |  |  | demon_girl, demon_horns, slit_pupils |
| 3 | 3 |  |  |  | demon_girl, demon_horns, hair_intakes |
| 4 | 10 |  |  |  | demon_horns, hair_intakes |
| 5 | 11 |  |  |  | hair_intakes |
| 6 | 4 |  |  |  | hair_intakes, huge_breasts, grey_eyes |
| 7 | 3 |  |  |  | blue_eyes, hair_intakes |
| 8 | 5 |  |  |  | hair_rings |
| 9 | 3 |  |  |  | hair_ornament, hair_rings |
| 10 | 7 |  |  |  | hair_ornament, hair_rings, star_hair_ornament |
| 11 | 5 |  |  |  | hair_intakes, hair_ornament, hair_rings, star_hair_ornament |
| 12 | 4 |  |  |  | slit_pupils |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | demon_horns | demon_girl | slit_pupils | hair_intakes | huge_breasts | grey_eyes | blue_eyes | hair_rings | hair_ornament | star_hair_ornament |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:--------------|:-------------|:--------------|:---------------|:---------------|:------------|:------------|:-------------|:----------------|:---------------------|
| 0 | 13 |  |  |  | X | | | | | | | | | |
| 1 | 20 |  |  |  | X | X | | | | | | | | |
| 2 | 4 |  |  |  | X | X | X | | | | | | | |
| 3 | 3 |  |  |  | X | X | | X | | | | | | |
| 4 | 10 |  |  |  | X | | | X | | | | | | |
| 5 | 11 |  |  |  | | | | X | | | | | | |
| 6 | 4 |  |  |  | | | | X | X | X | | | | |
| 7 | 3 |  |  |  | | | | X | | | X | | | |
| 8 | 5 |  |  |  | | | | | | | | X | | |
| 9 | 3 |  |  |  | | | | | | | | X | X | |
| 10 | 7 |  |  |  | | | | | | | | X | X | X |
| 11 | 5 |  |  |  | | | | X | | | | X | X | X |
| 12 | 4 |  |  |  | | | X | | | | | | | |
| narugo/test_v1.5_ds | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-05T17:15:19+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-05T23:35:19+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of surtr (Arknights)
============================
This is the dataset of surtr (Arknights), containing 200 images and their tags.
The core tags of this character are 'horns, long\_hair, red\_hair, purple\_eyes, breasts, bangs, very\_long\_hair, hair\_between\_eyes, large\_breasts, medium\_breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
List of Clusters
----------------
### Raw Text Version
### Table Version
| [
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Raw Text Version",
"### Table Version"
] | [
44,
5,
4
] | [
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Raw Text Version### Table Version"
] |
2018cda2d0995f55e8c1f63d496cd4152804a0bc | 1. The Dataset6K folder consists of two subfolders, Benign and Malignant, each containing 3k data samples.
2. The Smart transformation folder consists of three subfolders (tiny benign moles, large malignant moles, and multiple moles), each containing advanced skin lesion augmentation results.
3. If you need to generate more data using the Derm-T2IM model, this can be done by loading the Derm-T2IM model into the Stable Diffusion GUI, which can be cloned from the GitHub repo below.
Link: https://github.com/AUTOMATIC1111/stable-diffusion-webui
| MAli-Farooq/Derm-T2IM-Dataset | [
"task_categories:text-to-image",
"size_categories:1K<n<10K",
"language:en",
"license:mit",
"medical",
"code",
"region:us"
] | 2024-01-05T17:21:16+00:00 | {"language": ["en"], "license": "mit", "size_categories": ["1K<n<10K"], "task_categories": ["text-to-image"], "pretty_name": "DERM-T2IM Skin Lesion Dataset ", "tags": ["medical", "code"]} | 2024-01-08T12:34:40+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-to-image #size_categories-1K<n<10K #language-English #license-mit #medical #code #region-us
| 1. The Dataset6K folder consists of two subfolders, Benign and Malignant, each containing 3k data samples.
2. The Smart transformation folder consists of three subfolders (tiny benign moles, large malignant moles, and multiple moles), each containing advanced skin lesion augmentation results.
3. If you need to generate more data using the Derm-T2IM model, this can be done by loading the Derm-T2IM model into the Stable Diffusion GUI, which can be cloned from the GitHub repo below.
Link: URL
| [] | [
"TAGS\n#task_categories-text-to-image #size_categories-1K<n<10K #language-English #license-mit #medical #code #region-us \n"
] | [
44
] | [
"passage: TAGS\n#task_categories-text-to-image #size_categories-1K<n<10K #language-English #license-mit #medical #code #region-us \n"
] |
25cc11372546e6a213dff79b626a6ab1ec5e707d |
# Dataset Card for Evaluation run of maywell/TinyLlama-MoE-Chat
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [maywell/TinyLlama-MoE-Chat](https://huggingface.co/maywell/TinyLlama-MoE-Chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_maywell__TinyLlama-MoE-Chat",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-05T20:27:04.395344](https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__TinyLlama-MoE-Chat/blob/main/results_2024-01-05T20-27-04.395344.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3007501755634551,
"acc_stderr": 0.03230151074369316,
"acc_norm": 0.3030043188652545,
"acc_norm_stderr": 0.0331221039767778,
"mc1": 0.23990208078335373,
"mc1_stderr": 0.01494881267906214,
"mc2": 0.3935363377576707,
"mc2_stderr": 0.014416553400566495
},
"harness|arc:challenge|25": {
"acc": 0.3250853242320819,
"acc_stderr": 0.013688147309729129,
"acc_norm": 0.34726962457337884,
"acc_norm_stderr": 0.013913034529620444
},
"harness|hellaswag|10": {
"acc": 0.45180242979486157,
"acc_stderr": 0.004966544724452227,
"acc_norm": 0.5929097789285003,
"acc_norm_stderr": 0.0049028788067330365
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.26973684210526316,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.26973684210526316,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2943396226415094,
"acc_stderr": 0.028049186315695248,
"acc_norm": 0.2943396226415094,
"acc_norm_stderr": 0.028049186315695248
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3125,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.03295304696818318,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.03295304696818318
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.14705882352941177,
"acc_stderr": 0.035240689515674495,
"acc_norm": 0.14705882352941177,
"acc_norm_stderr": 0.035240689515674495
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2851063829787234,
"acc_stderr": 0.029513196625539355,
"acc_norm": 0.2851063829787234,
"acc_norm_stderr": 0.029513196625539355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322004,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322004
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.33793103448275863,
"acc_stderr": 0.039417076320648906,
"acc_norm": 0.33793103448275863,
"acc_norm_stderr": 0.039417076320648906
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2751322751322751,
"acc_stderr": 0.02300008685906863,
"acc_norm": 0.2751322751322751,
"acc_norm_stderr": 0.02300008685906863
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.040061680838488774,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.040061680838488774
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3096774193548387,
"acc_stderr": 0.026302774983517418,
"acc_norm": 0.3096774193548387,
"acc_norm_stderr": 0.026302774983517418
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.28078817733990147,
"acc_stderr": 0.0316185633535861,
"acc_norm": 0.28078817733990147,
"acc_norm_stderr": 0.0316185633535861
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.36363636363636365,
"acc_stderr": 0.03756335775187896,
"acc_norm": 0.36363636363636365,
"acc_norm_stderr": 0.03756335775187896
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35353535353535354,
"acc_stderr": 0.03406086723547153,
"acc_norm": 0.35353535353535354,
"acc_norm_stderr": 0.03406086723547153
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.31088082901554404,
"acc_stderr": 0.03340361906276586,
"acc_norm": 0.31088082901554404,
"acc_norm_stderr": 0.03340361906276586
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24871794871794872,
"acc_stderr": 0.021916957709213796,
"acc_norm": 0.24871794871794872,
"acc_norm_stderr": 0.021916957709213796
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.026067159222275798,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.026067159222275798
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.22268907563025211,
"acc_stderr": 0.027025433498882374,
"acc_norm": 0.22268907563025211,
"acc_norm_stderr": 0.027025433498882374
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23841059602649006,
"acc_stderr": 0.0347918557259966,
"acc_norm": 0.23841059602649006,
"acc_norm_stderr": 0.0347918557259966
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.25504587155963304,
"acc_stderr": 0.018688500856535843,
"acc_norm": 0.25504587155963304,
"acc_norm_stderr": 0.018688500856535843
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.030546745264953195,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.030546745264953195
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.03256685484460389,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.03256685484460389
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4177215189873418,
"acc_stderr": 0.032103530322412685,
"acc_norm": 0.4177215189873418,
"acc_norm_stderr": 0.032103530322412685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.39461883408071746,
"acc_stderr": 0.03280400504755291,
"acc_norm": 0.39461883408071746,
"acc_norm_stderr": 0.03280400504755291
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.3305785123966942,
"acc_stderr": 0.04294340845212094,
"acc_norm": 0.3305785123966942,
"acc_norm_stderr": 0.04294340845212094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.0471282125742677,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.0471282125742677
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.31901840490797545,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.31901840490797545,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.32038834951456313,
"acc_stderr": 0.046202840822800406,
"acc_norm": 0.32038834951456313,
"acc_norm_stderr": 0.046202840822800406
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.405982905982906,
"acc_stderr": 0.03217180182641087,
"acc_norm": 0.405982905982906,
"acc_norm_stderr": 0.03217180182641087
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.34099616858237547,
"acc_stderr": 0.016951781383223313,
"acc_norm": 0.34099616858237547,
"acc_norm_stderr": 0.016951781383223313
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2976878612716763,
"acc_stderr": 0.024617055388677003,
"acc_norm": 0.2976878612716763,
"acc_norm_stderr": 0.024617055388677003
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23575418994413408,
"acc_stderr": 0.014196375686290804,
"acc_norm": 0.23575418994413408,
"acc_norm_stderr": 0.014196375686290804
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3104575163398693,
"acc_stderr": 0.026493033225145894,
"acc_norm": 0.3104575163398693,
"acc_norm_stderr": 0.026493033225145894
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.33762057877813506,
"acc_stderr": 0.026858825879488544,
"acc_norm": 0.33762057877813506,
"acc_norm_stderr": 0.026858825879488544
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3117283950617284,
"acc_stderr": 0.02577311116963045,
"acc_norm": 0.3117283950617284,
"acc_norm_stderr": 0.02577311116963045
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.26595744680851063,
"acc_stderr": 0.026358065698880596,
"acc_norm": 0.26595744680851063,
"acc_norm_stderr": 0.026358065698880596
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2835723598435463,
"acc_stderr": 0.011511900775968333,
"acc_norm": 0.2835723598435463,
"acc_norm_stderr": 0.011511900775968333
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.22794117647058823,
"acc_stderr": 0.025483081468029804,
"acc_norm": 0.22794117647058823,
"acc_norm_stderr": 0.025483081468029804
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.01824902441120766,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.01824902441120766
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17959183673469387,
"acc_stderr": 0.024573293589585637,
"acc_norm": 0.17959183673469387,
"acc_norm_stderr": 0.024573293589585637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.263681592039801,
"acc_stderr": 0.031157150869355575,
"acc_norm": 0.263681592039801,
"acc_norm_stderr": 0.031157150869355575
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-virology|5": {
"acc": 0.29518072289156627,
"acc_stderr": 0.035509201856896294,
"acc_norm": 0.29518072289156627,
"acc_norm_stderr": 0.035509201856896294
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.28654970760233917,
"acc_stderr": 0.034678266857038266,
"acc_norm": 0.28654970760233917,
"acc_norm_stderr": 0.034678266857038266
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23990208078335373,
"mc1_stderr": 0.01494881267906214,
"mc2": 0.3935363377576707,
"mc2_stderr": 0.014416553400566495
},
"harness|winogrande|5": {
"acc": 0.6219415943172849,
"acc_stderr": 0.013628165460523242
},
"harness|gsm8k|5": {
"acc": 0.009855951478392721,
"acc_stderr": 0.0027210765770416634
}
}
```
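Every per-task entry above follows the same shape (`acc`, `acc_stderr`, `acc_norm`, `acc_norm_stderr`, with `mc1`/`mc2` for TruthfulQA), so aggregate numbers can be re-derived directly from the raw dictionary. The snippet below is a minimal sketch and not part of the evaluation harness; the `results` variable is assumed to already hold the dictionary printed above.

```python
from statistics import mean

def mmlu_average(results: dict) -> float:
    """Mean normalized accuracy over the MMLU (hendrycksTest) subtasks."""
    scores = [
        entry["acc_norm"]
        for task, entry in results.items()
        if task.startswith("harness|hendrycksTest-")
    ]
    return mean(scores)

def task_accuracy(results: dict, task: str) -> float:
    """Accuracy of a single benchmark, e.g. 'harness|gsm8k|5'."""
    return results[task]["acc"]
```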
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
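Pending a full description, a rough orientation (inferred from this card's configuration listing, so treat it as a sketch rather than an authoritative schema): each evaluated benchmark is exposed as its own configuration (e.g. `harness_gsm8k_5`), and each configuration carries one split per evaluation timestamp plus a `latest` split pointing at the most recent run. The example below uses the standard `datasets` inspection helpers to enumerate them; the configuration name is only an illustrative choice.

```python
from datasets import get_dataset_config_names, get_dataset_split_names, load_dataset

repo = "open-llm-leaderboard/details_maywell__TinyLlama-MoE-Chat"

# One configuration per benchmark / few-shot setting.
configs = get_dataset_config_names(repo)

# Each configuration has timestamped run splits plus a "latest" alias.
splits = get_dataset_split_names(repo, "harness_gsm8k_5")

# Load the most recent run for that benchmark.
details = load_dataset(repo, "harness_gsm8k_5", split="latest")
```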
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
"region:us"
] | 2024-01-05T17:25:02+00:00 | {"pretty_name": "Evaluation run of maywell/TinyLlama-MoE-Chat", "dataset_summary": "Dataset automatically created during the evaluation run of model [maywell/TinyLlama-MoE-Chat](https://huggingface.co/maywell/TinyLlama-MoE-Chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_maywell__TinyLlama-MoE-Chat\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-05T20:27:04.395344](https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__TinyLlama-MoE-Chat/blob/main/results_2024-01-05T20-27-04.395344.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3007501755634551,\n \"acc_stderr\": 0.03230151074369316,\n \"acc_norm\": 0.3030043188652545,\n \"acc_norm_stderr\": 0.0331221039767778,\n \"mc1\": 0.23990208078335373,\n \"mc1_stderr\": 0.01494881267906214,\n \"mc2\": 0.3935363377576707,\n \"mc2_stderr\": 0.014416553400566495\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3250853242320819,\n \"acc_stderr\": 0.013688147309729129,\n \"acc_norm\": 0.34726962457337884,\n \"acc_norm_stderr\": 0.013913034529620444\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.45180242979486157,\n \"acc_stderr\": 0.004966544724452227,\n \"acc_norm\": 0.5929097789285003,\n \"acc_norm_stderr\": 0.0049028788067330365\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.26973684210526316,\n \"acc_stderr\": 0.03611780560284898,\n \"acc_norm\": 0.26973684210526316,\n \"acc_norm_stderr\": 0.03611780560284898\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2943396226415094,\n \"acc_stderr\": 0.028049186315695248,\n \"acc_norm\": 0.2943396226415094,\n \"acc_norm_stderr\": 0.028049186315695248\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 
0.04408440022768078,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.03295304696818318,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.03295304696818318\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.14705882352941177,\n \"acc_stderr\": 0.035240689515674495,\n \"acc_norm\": 0.14705882352941177,\n \"acc_norm_stderr\": 0.035240689515674495\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2851063829787234,\n \"acc_stderr\": 0.029513196625539355,\n \"acc_norm\": 0.2851063829787234,\n \"acc_norm_stderr\": 0.029513196625539355\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.042270544512322004,\n \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.042270544512322004\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.33793103448275863,\n \"acc_stderr\": 0.039417076320648906,\n \"acc_norm\": 0.33793103448275863,\n \"acc_norm_stderr\": 0.039417076320648906\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2751322751322751,\n \"acc_stderr\": 0.02300008685906863,\n \"acc_norm\": 0.2751322751322751,\n \"acc_norm_stderr\": 0.02300008685906863\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.040061680838488774,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.040061680838488774\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3096774193548387,\n \"acc_stderr\": 0.026302774983517418,\n \"acc_norm\": 0.3096774193548387,\n \"acc_norm_stderr\": 0.026302774983517418\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.28078817733990147,\n \"acc_stderr\": 0.0316185633535861,\n \"acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.0316185633535861\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.36363636363636365,\n \"acc_stderr\": 0.03756335775187896,\n \"acc_norm\": 0.36363636363636365,\n \"acc_norm_stderr\": 0.03756335775187896\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.35353535353535354,\n \"acc_stderr\": 0.03406086723547153,\n \"acc_norm\": 0.35353535353535354,\n \"acc_norm_stderr\": 0.03406086723547153\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.31088082901554404,\n \"acc_stderr\": 0.03340361906276586,\n \"acc_norm\": 0.31088082901554404,\n \"acc_norm_stderr\": 0.03340361906276586\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.24871794871794872,\n \"acc_stderr\": 0.021916957709213796,\n \"acc_norm\": 0.24871794871794872,\n \"acc_norm_stderr\": 0.021916957709213796\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275798,\n \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275798\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.22268907563025211,\n \"acc_stderr\": 0.027025433498882374,\n \"acc_norm\": 0.22268907563025211,\n \"acc_norm_stderr\": 0.027025433498882374\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.23841059602649006,\n \"acc_stderr\": 0.0347918557259966,\n \"acc_norm\": 0.23841059602649006,\n \"acc_norm_stderr\": 0.0347918557259966\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.25504587155963304,\n \"acc_stderr\": 0.018688500856535843,\n \"acc_norm\": 0.25504587155963304,\n \"acc_norm_stderr\": 0.018688500856535843\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.030546745264953195,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.030546745264953195\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.03256685484460389,\n \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.03256685484460389\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.4177215189873418,\n \"acc_stderr\": 0.032103530322412685,\n \"acc_norm\": 0.4177215189873418,\n \"acc_norm_stderr\": 0.032103530322412685\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.39461883408071746,\n \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.39461883408071746,\n \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.3305785123966942,\n \"acc_stderr\": 0.04294340845212094,\n \"acc_norm\": 0.3305785123966942,\n \"acc_norm_stderr\": 0.04294340845212094\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.0471282125742677,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.0471282125742677\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.31901840490797545,\n \"acc_stderr\": 0.03661997551073836,\n \"acc_norm\": 0.31901840490797545,\n \"acc_norm_stderr\": 0.03661997551073836\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.32038834951456313,\n \"acc_stderr\": 0.046202840822800406,\n \"acc_norm\": 0.32038834951456313,\n \"acc_norm_stderr\": 0.046202840822800406\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.405982905982906,\n \"acc_stderr\": 0.03217180182641087,\n \"acc_norm\": 0.405982905982906,\n \"acc_norm_stderr\": 0.03217180182641087\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.34099616858237547,\n \"acc_stderr\": 0.016951781383223313,\n \"acc_norm\": 0.34099616858237547,\n \"acc_norm_stderr\": 0.016951781383223313\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2976878612716763,\n \"acc_stderr\": 0.024617055388677003,\n \"acc_norm\": 0.2976878612716763,\n \"acc_norm_stderr\": 0.024617055388677003\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23575418994413408,\n \"acc_stderr\": 0.014196375686290804,\n \"acc_norm\": 0.23575418994413408,\n \"acc_norm_stderr\": 0.014196375686290804\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.3104575163398693,\n \"acc_stderr\": 0.026493033225145894,\n \"acc_norm\": 0.3104575163398693,\n \"acc_norm_stderr\": 0.026493033225145894\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.33762057877813506,\n \"acc_stderr\": 0.026858825879488544,\n \"acc_norm\": 0.33762057877813506,\n \"acc_norm_stderr\": 0.026858825879488544\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.3117283950617284,\n \"acc_stderr\": 0.02577311116963045,\n \"acc_norm\": 0.3117283950617284,\n \"acc_norm_stderr\": 0.02577311116963045\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.26595744680851063,\n \"acc_stderr\": 0.026358065698880596,\n \"acc_norm\": 0.26595744680851063,\n \"acc_norm_stderr\": 0.026358065698880596\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2835723598435463,\n \"acc_stderr\": 0.011511900775968333,\n \"acc_norm\": 0.2835723598435463,\n \"acc_norm_stderr\": 0.011511900775968333\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.22794117647058823,\n \"acc_stderr\": 0.025483081468029804,\n \"acc_norm\": 0.22794117647058823,\n \"acc_norm_stderr\": 0.025483081468029804\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.01824902441120766,\n \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.01824902441120766\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2909090909090909,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.2909090909090909,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.17959183673469387,\n \"acc_stderr\": 0.024573293589585637,\n \"acc_norm\": 0.17959183673469387,\n \"acc_norm_stderr\": 0.024573293589585637\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.263681592039801,\n \"acc_stderr\": 0.031157150869355575,\n \"acc_norm\": 0.263681592039801,\n \"acc_norm_stderr\": 0.031157150869355575\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.29518072289156627,\n \"acc_stderr\": 0.035509201856896294,\n \"acc_norm\": 0.29518072289156627,\n \"acc_norm_stderr\": 0.035509201856896294\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.28654970760233917,\n \"acc_stderr\": 0.034678266857038266,\n \"acc_norm\": 0.28654970760233917,\n \"acc_norm_stderr\": 0.034678266857038266\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23990208078335373,\n \"mc1_stderr\": 0.01494881267906214,\n \"mc2\": 0.3935363377576707,\n \"mc2_stderr\": 0.014416553400566495\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6219415943172849,\n \"acc_stderr\": 0.013628165460523242\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.009855951478392721,\n 
\"acc_stderr\": 0.0027210765770416634\n }\n}\n```", "repo_url": "https://huggingface.co/maywell/TinyLlama-MoE-Chat", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|arc:challenge|25_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|arc:challenge|25_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|arc:challenge|25_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|gsm8k|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|gsm8k|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|gsm8k|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hellaswag|10_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hellaswag|10_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hellaswag|10_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T17-23-06.770918.parquet", 
"**/details_harness|hendrycksTest-global_facts|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T17-23-06.770918.parquet", 
"**/details_harness|hendrycksTest-virology|5_2024-01-05T17-23-06.770918.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T20-25-22.413235.parquet", 
"**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T20-25-22.413235.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T20-27-04.395344.parquet", 
"**/details_harness|hendrycksTest-global_facts|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T20-27-04.395344.parquet", 
"**/details_harness|hendrycksTest-virology|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T20-27-04.395344.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T20-27-04.395344.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": 
"2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": 
["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": 
["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": 
["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T20-27-04.395344.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_05T17_23_06.770918", "path": ["**/details_harness|winogrande|5_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["**/details_harness|winogrande|5_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["**/details_harness|winogrande|5_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-05T20-27-04.395344.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_05T17_23_06.770918", 
"path": ["results_2024-01-05T17-23-06.770918.parquet"]}, {"split": "2024_01_05T20_25_22.413235", "path": ["results_2024-01-05T20-25-22.413235.parquet"]}, {"split": "2024_01_05T20_27_04.395344", "path": ["results_2024-01-05T20-27-04.395344.parquet"]}, {"split": "latest", "path": ["results_2024-01-05T20-27-04.395344.parquet"]}]}]} | 2024-01-05T20:28:59+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of maywell/TinyLlama-MoE-Chat
Dataset automatically created during the evaluation run of model maywell/TinyLlama-MoE-Chat on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
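A minimal sketch, assuming the details repository follows the leaderboard's usual `details_<org>__<model>` naming and using the `harness_winogrande_5` configuration listed in the metadata above:

```python
from datasets import load_dataset

# Repo id is an assumption based on the Open LLM Leaderboard naming scheme.
data = load_dataset(
    "open-llm-leaderboard/details_maywell__TinyLlama-MoE-Chat",
    "harness_winogrande_5",
    split="train",
)
```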
## Latest results
These are the latest results from run 2024-01-05T20:27:04.395344 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of maywell/TinyLlama-MoE-Chat\n\n\n\nDataset automatically created during the evaluation run of model maywell/TinyLlama-MoE-Chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T20:27:04.395344(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of maywell/TinyLlama-MoE-Chat\n\n\n\nDataset automatically created during the evaluation run of model maywell/TinyLlama-MoE-Chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-05T20:27:04.395344(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
185,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of maywell/TinyLlama-MoE-Chat\n\n\n\nDataset automatically created during the evaluation run of model maywell/TinyLlama-MoE-Chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-05T20:27:04.395344(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
0eed3367191fbd44d67ecc0e7749ac6acbc28032 |
# Big Hard Negatives Dataset
A dataset for training embedding models for semantic search.
TODO: add desc
A dataset in a [nixietune](https://github.com/nixiesearch/nixietune) compatible format:
```json
{
"query": ")what was the immediate impact of the success of the manhattan project?",
"pos": [
"The presence of communication amid scientific minds was equally important to the success of the Manhattan Project as scientific intellect was. The only cloud hanging over the impressive achievement of the atomic researchers and engineers is what their success truly meant; hundreds of thousands of innocent lives obliterated."
],
"neg": [
"Abstract. The pivotal engineering and scientific success of the Twentieth century was the Manhattan Project. The Manhattan Project assimilated concepts and leaders from all scientific fields and engineering disciplines to construct the first two atomic bombs.",
"The pivotal engineering and scientific success of the Twentieth century was the Manhattan Project. The Manhattan Project assimilated concepts and leaders from all scientific fields and engineering disciplines to construct the first two atomic bombs."
]
}
```
## Usage
To use with HF datasets:
```bash
pip install datasets zstandard
```
```python
from datasets import load_dataset
data = load_dataset('nixiesearch/bfhardneg-small')
print(data["train"].features)
```
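As a quick follow-up, a sketch of inspecting one training triple; the column names below (`query`, `positive`, `negative`) are taken from the dataset metadata and are an assumption about this particular export; the nixietune JSON example above uses `pos`/`neg` instead:

```python
# Inspect the first training triple (column names assumed as noted above).
row = data["train"][0]
print(row["query"])           # the user search query
print(row["positive"][0])     # a relevant document
print(row["negative"][0])     # a hard negative document
```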
## License
Apache 2.0 | nixiesearch/bfhnd-small | [
"task_categories:sentence-similarity",
"size_categories:100K<n<1M",
"source_datasets:BeIR",
"language:en",
"license:apache-2.0",
"text",
"region:us"
] | 2024-01-05T17:34:07+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["100K<n<1M"], "source_datasets": ["BeIR"], "task_categories": ["sentence-similarity"], "pretty_name": "BFHND: Big Hard Negatives Dataset (1M sample)", "tags": ["text"], "dataset_info": {"config_name": "default", "features": [{"name": "query", "dtype": "string"}, {"name": "positive", "sequence": "string"}, {"name": "negative", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 226515502, "num_examples": 1000000}]}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train/*"}]}], "train-eval-index": [{"config": "default", "task": "sentence-similarity", "splits": {"train_split": "train"}}]} | 2024-01-05T17:35:36+00:00 | [] | [
"en"
] | TAGS
#task_categories-sentence-similarity #size_categories-100K<n<1M #source_datasets-BeIR #language-English #license-apache-2.0 #text #region-us
|
# Big Hard Negatives Dataset
A dataset for training embedding models for semantic search.
TODO: add desc
A dataset in a nixietune compatible format:
## Usage
To use with HF datasets:
## License
Apache 2.0 | [
"# Big Hard Negatives Dataset\n\nA dataset for training embedding models for semantic search.\n\nTODO: add desc\n\nA dataset in a nixietune compatible format:",
"## Usage\n\nTo use with HF datasets:",
"## License\n\nApache 2.0"
] | [
"TAGS\n#task_categories-sentence-similarity #size_categories-100K<n<1M #source_datasets-BeIR #language-English #license-apache-2.0 #text #region-us \n",
"# Big Hard Negatives Dataset\n\nA dataset for training embedding models for semantic search.\n\nTODO: add desc\n\nA dataset in a nixietune compatible format:",
"## Usage\n\nTo use with HF datasets:",
"## License\n\nApache 2.0"
] | [
54,
38,
12,
5
] | [
"passage: TAGS\n#task_categories-sentence-similarity #size_categories-100K<n<1M #source_datasets-BeIR #language-English #license-apache-2.0 #text #region-us \n# Big Hard Negatives Dataset\n\nA dataset for training embedding models for semantic search.\n\nTODO: add desc\n\nA dataset in a nixietune compatible format:## Usage\n\nTo use with HF datasets:## License\n\nApache 2.0"
] |
e2912d30b75eaf03ecbf1f2bba38f1a14e47c3d6 | # Dataset Card for "training_v0.0.5-public"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | male-2/training_v0.0.5-public | [
"region:us"
] | 2024-01-05T18:18:12+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "conversations", "list": [{"name": "from", "dtype": "string"}, {"name": "value", "dtype": "string"}]}, {"name": "emotion", "struct": [{"name": "joyful", "dtype": "bool"}, {"name": "sad", "dtype": "bool"}, {"name": "angry", "dtype": "bool"}]}, {"name": "example", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1018, "num_examples": 1}], "download_size": 9065, "dataset_size": 1018}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-06T10:23:25+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "training_v0.0.5-public"
More Information needed | [
"# Dataset Card for \"training_v0.0.5-public\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"training_v0.0.5-public\"\n\nMore Information needed"
] | [
6,
17
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"training_v0.0.5-public\"\n\nMore Information needed"
] |
8d3372a5874ade9e93be8b4444f0d31178e07502 |
- The dataset has been gathered from assignments of the [Klokánek](https://matematickyklokan.net/) competition from 2004-2022.
- I have done rule-based filtering to filter out picture-related assignments.
- The category denotes the difficulty of the task; the range is elementary school to high school. Check an example test from Klokánek for more information.
- If you find an error in a solution or find that an assignment is unsolvable, please contact me.
- If you have any questions, please contact me at [email protected]
- The dataset is released under the non-commercial licence CC BY-NC-SA.
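A minimal loading sketch with HF `datasets` (column names taken from the dataset metadata):

```python
from datasets import load_dataset

data = load_dataset("hynky/klokan-qa", split="train")

# Each row is a multiple-choice question with options A-E.
row = data[0]
print(row["question"])
print(row["answers.A"], row["answers.B"], row["answers.C"], row["answers.D"], row["answers.E"])
print("correct:", row["correct_answer"], "| category:", row["category"], "| year:", row["year"])
```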
Cite:
```
@misc{klokanek-dataset,
author = {Hynek Kydlíček, David Nocar et al.},
title = {Klokánek dataset},
year = {2023},
publisher = {Hynek Kydlíček},
doi = { 10.57967/hf/1608 },
url = {https://matematickyklokan.net/},
howpublished = "\url{https://huggingface.co/datasets/hynky/klokan-qa}"
}
``` | hynky/klokan-qa | [
"task_categories:question-answering",
"size_categories:n<1K",
"language:cs",
"license:cc",
"doi:10.57967/hf/1609",
"region:us"
] | 2024-01-05T18:19:46+00:00 | {"language": ["cs"], "license": "cc", "size_categories": ["n<1K"], "task_categories": ["question-answering"], "pretty_name": "KLOKAN - Czech matehmatical dataset", "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "answers.A", "dtype": "string"}, {"name": "answers.B", "dtype": "string"}, {"name": "answers.C", "dtype": "string"}, {"name": "answers.D", "dtype": "string"}, {"name": "answers.E", "dtype": "string"}, {"name": "correct_answer", "dtype": "string"}, {"name": "category", "dtype": "int64"}, {"name": "year", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 211043, "num_examples": 829}], "download_size": 126479, "dataset_size": 211043}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2024-01-14T22:27:19+00:00 | [] | [
"cs"
] | TAGS
#task_categories-question-answering #size_categories-n<1K #language-Czech #license-cc #doi-10.57967/hf/1609 #region-us
|
- The dataset has been gather from assigments of Klokánek competition from 2004-2022.
- I have done rule based filtering to filter-out picture related assigments
- The category denote the difficulty of the task, the range is elementary school to high-school. Check example test from Klokánek for more information.
- If you find an error in solution or find that the assigment is unsolvable, please contact me.
- If you have any question please contact me at URL@URL
- The dataset is realesed under non-comercial licence CC BY-NC-SA
Cite:
| [] | [
"TAGS\n#task_categories-question-answering #size_categories-n<1K #language-Czech #license-cc #doi-10.57967/hf/1609 #region-us \n"
] | [
51
] | [
"passage: TAGS\n#task_categories-question-answering #size_categories-n<1K #language-Czech #license-cc #doi-10.57967/hf/1609 #region-us \n"
] |
16f12268649d14b10bbbc9cff23e39920d65c3b4 | <div align="center">
# TinyLlama-1.1B
</div>
https://github.com/jzhang38/TinyLlama
The TinyLlama project aims to **pretrain** a **1.1B Llama model on 3 trillion tokens**. With some proper optimization, we can achieve this within a span of "just" 90 days using 16 A100-40G GPUs 🚀🚀. Training started on 2023-09-01.
We adopted exactly the same architecture and tokenizer as Llama 2. This means TinyLlama can be plugged and played in many open-source projects built upon Llama. Besides, TinyLlama is compact with only 1.1B parameters. This compactness allows it to cater to a multitude of applications demanding a restricted computation and memory footprint.
#### This Model
This is the chat model fine-tuned on top of [TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T](https://huggingface.co/TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T). **We follow [HF's Zephyr](https://huggingface.co/HuggingFaceH4/zephyr-7b-alpha)'s training recipe.** The model was "initially fine-tuned on a variant of the [`UltraChat`](https://huggingface.co/datasets/stingning/ultrachat) dataset, which contains a diverse range of synthetic dialogues generated by ChatGPT.
We then further aligned the model with [🤗 TRL's](https://github.com/huggingface/trl) `DPOTrainer` on the [openbmb/UltraFeedback](https://huggingface.co/datasets/openbmb/UltraFeedback) dataset, which contains 64k prompts and model completions that are ranked by GPT-4."
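As a rough illustration of the alignment step described above (not the exact recipe): the snippet below is an untested sketch; the `DPOTrainer` arguments follow TRL's late-2023 API, and the flattening of the binarized UltraFeedback variant listed in this card's metadata into plain-text `prompt`/`chosen`/`rejected` columns is a simplification.

```python
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer, TrainingArguments
from trl import DPOTrainer

# In practice the SFT (UltraChat) checkpoint would be used; the base checkpoint is a stand-in here.
base = "TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForCausalLM.from_pretrained(base)

def flatten(example):
    # chosen/rejected are chat-message lists; keep the final assistant turn as plain text.
    return {
        "prompt": example["prompt"],
        "chosen": example["chosen"][-1]["content"],
        "rejected": example["rejected"][-1]["content"],
    }

dataset = load_dataset("HuggingFaceH4/ultrafeedback_binarized", split="train_prefs").map(flatten)

trainer = DPOTrainer(
    model,
    ref_model=None,  # TRL builds the frozen reference copy when None
    args=TrainingArguments(
        output_dir="tinyllama-dpo",
        per_device_train_batch_size=2,
        num_train_epochs=1,
        remove_unused_columns=False,
    ),
    beta=0.1,  # Zephyr-style default; an assumption here
    train_dataset=dataset,
    tokenizer=tokenizer,
    max_length=1024,
    max_prompt_length=512,
)
trainer.train()
```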
#### How to use
You will need transformers>=4.34.
Do check the [TinyLlama](https://github.com/jzhang38/TinyLlama) github page for more information.
```python
# Install transformers from source - only needed for versions <= v4.34
# pip install git+https://github.com/huggingface/transformers.git
# pip install accelerate
import torch
from transformers import pipeline
pipe = pipeline("text-generation", model="TinyLlama/TinyLlama-1.1B-Chat-v1.0", torch_dtype=torch.bfloat16, device_map="auto")
# We use the tokenizer's chat template to format each message - see https://huggingface.co/docs/transformers/main/en/chat_templating
messages = [
{
"role": "system",
"content": "You are a friendly chatbot who always responds in the style of a pirate",
},
{"role": "user", "content": "How many helicopters can a human eat in one sitting?"},
]
prompt = pipe.tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
outputs = pipe(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
# <|system|>
# You are a friendly chatbot who always responds in the style of a pirate.</s>
# <|user|>
# How many helicopters can a human eat in one sitting?</s>
# <|assistant|>
# ...
``` | ziffir/TinyLlama-1.1B-Chat-v1.0.1 | [
"language:en",
"license:apache-2.0",
"region:us"
] | 2024-01-05T18:58:35+00:00 | {"language": ["en"], "license": "apache-2.0", "datasets": ["cerebras/SlimPajama-627B", "bigcode/starcoderdata", "HuggingFaceH4/ultrachat_200k", "HuggingFaceH4/ultrafeedback_binarized"], "widget": [{"text": "<|system|>\nYou are a chatbot who can help code!</s>\n<|user|>\nWrite me a function to calculate the first 10 digits of the fibonacci sequence in Python and print it out to the CLI.</s>\n<|assistant|>\n"}]} | 2024-01-05T19:05:06+00:00 | [] | [
"en"
] | TAGS
#language-English #license-apache-2.0 #region-us
| <div align="center">
# TinyLlama-1.1B
</div>
URL
The TinyLlama project aims to pretrain a 1.1B Llama model on 3 trillion tokens. With some proper optimization, we can achieve this within a span of "just" 90 days using 16 A100-40G GPUs . The training has started on 2023-09-01.
We adopted exactly the same architecture and tokenizer as Llama 2. This means TinyLlama can be plugged and played in many open-source projects built upon Llama. Besides, TinyLlama is compact with only 1.1B parameters. This compactness allows it to cater to a multitude of applications demanding a restricted computation and memory footprint.
#### This Model
This is the chat model finetuned on top of TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T. We follow HF's Zephyr's training recipe. The model was " initially fine-tuned on a variant of the 'UltraChat' dataset, which contains a diverse range of synthetic dialogues generated by ChatGPT.
We then further aligned the model with TRL's 'DPOTrainer' on the openbmb/UltraFeedback dataset, which contain 64k prompts and model completions that are ranked by GPT-4."
#### How to use
You will need the transformers>=4.34
Do check the TinyLlama github page for more information.
| [
"# TinyLlama-1.1B\n</div>\n\nURL\n\nThe TinyLlama project aims to pretrain a 1.1B Llama model on 3 trillion tokens. With some proper optimization, we can achieve this within a span of \"just\" 90 days using 16 A100-40G GPUs . The training has started on 2023-09-01. \n\n\nWe adopted exactly the same architecture and tokenizer as Llama 2. This means TinyLlama can be plugged and played in many open-source projects built upon Llama. Besides, TinyLlama is compact with only 1.1B parameters. This compactness allows it to cater to a multitude of applications demanding a restricted computation and memory footprint.",
"#### This Model\nThis is the chat model finetuned on top of TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T. We follow HF's Zephyr's training recipe. The model was \" initially fine-tuned on a variant of the 'UltraChat' dataset, which contains a diverse range of synthetic dialogues generated by ChatGPT. \nWe then further aligned the model with TRL's 'DPOTrainer' on the openbmb/UltraFeedback dataset, which contain 64k prompts and model completions that are ranked by GPT-4.\"",
"#### How to use\nYou will need the transformers>=4.34\nDo check the TinyLlama github page for more information."
] | [
"TAGS\n#language-English #license-apache-2.0 #region-us \n",
"# TinyLlama-1.1B\n</div>\n\nURL\n\nThe TinyLlama project aims to pretrain a 1.1B Llama model on 3 trillion tokens. With some proper optimization, we can achieve this within a span of \"just\" 90 days using 16 A100-40G GPUs . The training has started on 2023-09-01. \n\n\nWe adopted exactly the same architecture and tokenizer as Llama 2. This means TinyLlama can be plugged and played in many open-source projects built upon Llama. Besides, TinyLlama is compact with only 1.1B parameters. This compactness allows it to cater to a multitude of applications demanding a restricted computation and memory footprint.",
"#### This Model\nThis is the chat model finetuned on top of TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T. We follow HF's Zephyr's training recipe. The model was \" initially fine-tuned on a variant of the 'UltraChat' dataset, which contains a diverse range of synthetic dialogues generated by ChatGPT. \nWe then further aligned the model with TRL's 'DPOTrainer' on the openbmb/UltraFeedback dataset, which contain 64k prompts and model completions that are ranked by GPT-4.\"",
"#### How to use\nYou will need the transformers>=4.34\nDo check the TinyLlama github page for more information."
] | [
18,
155,
146,
29
] | [
"passage: TAGS\n#language-English #license-apache-2.0 #region-us \n# TinyLlama-1.1B\n</div>\n\nURL\n\nThe TinyLlama project aims to pretrain a 1.1B Llama model on 3 trillion tokens. With some proper optimization, we can achieve this within a span of \"just\" 90 days using 16 A100-40G GPUs . The training has started on 2023-09-01. \n\n\nWe adopted exactly the same architecture and tokenizer as Llama 2. This means TinyLlama can be plugged and played in many open-source projects built upon Llama. Besides, TinyLlama is compact with only 1.1B parameters. This compactness allows it to cater to a multitude of applications demanding a restricted computation and memory footprint.#### This Model\nThis is the chat model finetuned on top of TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T. We follow HF's Zephyr's training recipe. The model was \" initially fine-tuned on a variant of the 'UltraChat' dataset, which contains a diverse range of synthetic dialogues generated by ChatGPT. \nWe then further aligned the model with TRL's 'DPOTrainer' on the openbmb/UltraFeedback dataset, which contain 64k prompts and model completions that are ranked by GPT-4.\"#### How to use\nYou will need the transformers>=4.34\nDo check the TinyLlama github page for more information."
] |
8f8ba14b5641d267ff98e0d771501e4f6e56dc7b | license: mit | michaelmallari/us-census | [
"region:us"
] | 2024-01-05T19:00:38+00:00 | {} | 2024-01-05T19:25:00+00:00 | [] | [] | TAGS
#region-us
| license: mit | [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
24f4a706135460a722deb8c81c838451b7ef344a | RAG training dataset generated with GPT-3.5.
````python
prompt = """You have been assigned a retrieval task: {task}
Your mission is to write one text retrieval example for this task in JSON format. The JSON object must
contain the following keys:
- 'user_query': a string, a random user search query specified by the retrieval task.
- 'positive_document': a string, a relevant document for the user query.
- 'hard_negative_document': a string, a hard negative document that only appears relevant to the query.
Please adhere to the following guidelines:
- The 'user_query' should be {query_type}, {query_length}, {clarity}, and diverse in topic.
- Both the query and documents should be in German.
- The 'positive_document' should directly answer or be about the 'user_query'.
- The 'hard_negative_document' should be topically similar to the 'user_query' but should not answer or satisfy the query.
- The 'hard_negative_document' should be subtly irrelevant, meaning it appears to be related to the 'user_query' but does not provide a useful answer or information.
- Ensure that the documents are not copies of each other and contain unique content.
- The JSON object should be properly formatted and should validate against JSON standards.
Here is an example of how your JSON object might look for a retrieval task:
```json
{{
'user_query': '...',
'positive_document': '...',
'hard_negative_document': '...'
}}
```
Your output must always be just a JSON object only, do not explain yourself or output anything else. Always create it in German! You will get tiped 1000€ if you generate the right lengths!"""
```` | SebastianBodza/synthetischer_RAG_Datensatz_prototype | [
"region:us"
] | 2024-01-05T19:00:58+00:00 | {} | 2024-01-05T19:03:30+00:00 | [] | [] | TAGS
#region-us
 | RAG training dataset generated with GPT-3.5.
json
{{
'user_query': '...',
'positive_document': '...',
'hard_negative_document': '...'
}}
| [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
568812b759acd5f43cb04794bc6c446545b578bb |
# IP-Adapter-FaceID Model Card
<div align="center">
[**Project Page**](https://ip-adapter.github.io) **|** [**Paper (ArXiv)**](https://arxiv.org/abs/2308.06721) **|** [**Code**](https://github.com/tencent-ailab/IP-Adapter)
</div>
---
## Introduction
An experimental version of IP-Adapter-FaceID: we use face ID embedding from a face recognition model instead of CLIP image embedding; additionally, we use LoRA to improve ID consistency. IP-Adapter-FaceID can generate various style images conditioned on a face with only text prompts.

**Update 2023/12/27**:
IP-Adapter-FaceID-Plus: face ID embedding (for face ID) + CLIP image embedding (for face structure)
<div align="center">

</div>
**Update 2023/12/28**:
IP-Adapter-FaceID-PlusV2: face ID embedding (for face ID) + controllable CLIP image embedding (for face structure)
You can adjust the weight of the face structure (the `s_scale` argument, see the sweep sketch at the end of the Usage section) to get different generations!
<div align="center">

</div>
**Update 2024/01/04**:
IP-Adapter-FaceID-SDXL: An experimental SDXL version of IP-Adapter-FaceID
<div align="center">

</div>
## Usage
### IP-Adapter-FaceID
Firstly, you should use [insightface](https://github.com/deepinsight/insightface) to extract face ID embedding:
```python
import cv2
from insightface.app import FaceAnalysis
import torch
app = FaceAnalysis(name="buffalo_l", providers=['CUDAExecutionProvider', 'CPUExecutionProvider'])
app.prepare(ctx_id=0, det_size=(640, 640))
image = cv2.imread("person.jpg")
faces = app.get(image)
faceid_embeds = torch.from_numpy(faces[0].normed_embedding).unsqueeze(0)
```
Then, you can generate images conditioned on the face embeddings:
```python
import torch
from diffusers import StableDiffusionPipeline, DDIMScheduler, AutoencoderKL
from PIL import Image
from ip_adapter.ip_adapter_faceid import IPAdapterFaceID
base_model_path = "SG161222/Realistic_Vision_V4.0_noVAE"
vae_model_path = "stabilityai/sd-vae-ft-mse"
ip_ckpt = "ip-adapter-faceid_sd15.bin"
device = "cuda"
noise_scheduler = DDIMScheduler(
num_train_timesteps=1000,
beta_start=0.00085,
beta_end=0.012,
beta_schedule="scaled_linear",
clip_sample=False,
set_alpha_to_one=False,
steps_offset=1,
)
vae = AutoencoderKL.from_pretrained(vae_model_path).to(dtype=torch.float16)
pipe = StableDiffusionPipeline.from_pretrained(
base_model_path,
torch_dtype=torch.float16,
scheduler=noise_scheduler,
vae=vae,
feature_extractor=None,
safety_checker=None
)
# load ip-adapter
ip_model = IPAdapterFaceID(pipe, ip_ckpt, device)
# generate image
prompt = "photo of a woman in red dress in a garden"
negative_prompt = "monochrome, lowres, bad anatomy, worst quality, low quality, blurry"
images = ip_model.generate(
prompt=prompt, negative_prompt=negative_prompt, faceid_embeds=faceid_embeds, num_samples=4, width=512, height=768, num_inference_steps=30, seed=2023
)
```
### IP-Adapter-FaceID-SDXL
Firstly, you should use [insightface](https://github.com/deepinsight/insightface) to extract face ID embedding:
```python
import cv2
from insightface.app import FaceAnalysis
import torch
app = FaceAnalysis(name="buffalo_l", providers=['CUDAExecutionProvider', 'CPUExecutionProvider'])
app.prepare(ctx_id=0, det_size=(640, 640))
image = cv2.imread("person.jpg")
faces = app.get(image)
faceid_embeds = torch.from_numpy(faces[0].normed_embedding).unsqueeze(0)
```
Then, you can generate images conditioned on the face embeddings:
```python
import torch
from diffusers import StableDiffusionXLPipeline, DDIMScheduler
from PIL import Image
from ip_adapter.ip_adapter_faceid import IPAdapterFaceIDXL
base_model_path = "SG161222/RealVisXL_V3.0"
ip_ckpt = "ip-adapter-faceid_sdxl.bin"
device = "cuda"
noise_scheduler = DDIMScheduler(
num_train_timesteps=1000,
beta_start=0.00085,
beta_end=0.012,
beta_schedule="scaled_linear",
clip_sample=False,
set_alpha_to_one=False,
steps_offset=1,
)
pipe = StableDiffusionXLPipeline.from_pretrained(
base_model_path,
torch_dtype=torch.float16,
scheduler=noise_scheduler,
add_watermarker=False,
)
# load ip-adapter
ip_model = IPAdapterFaceIDXL(pipe, ip_ckpt, device)
# generate image
prompt = "A closeup shot of a beautiful Asian teenage girl in a white dress wearing small silver earrings in the garden, under the soft morning light"
negative_prompt = "monochrome, lowres, bad anatomy, worst quality, low quality, blurry"
images = ip_model.generate(
prompt=prompt, negative_prompt=negative_prompt, faceid_embeds=faceid_embeds, num_samples=2,
width=1024, height=1024,
num_inference_steps=30, guidance_scale=7.5, seed=2023
)
```
### IP-Adapter-FaceID-Plus
Firstly, you should use [insightface](https://github.com/deepinsight/insightface) to extract face ID embedding and face image:
```python
import cv2
from insightface.app import FaceAnalysis
from insightface.utils import face_align
import torch
app = FaceAnalysis(name="buffalo_l", providers=['CUDAExecutionProvider', 'CPUExecutionProvider'])
app.prepare(ctx_id=0, det_size=(640, 640))
image = cv2.imread("person.jpg")
faces = app.get(image)
faceid_embeds = torch.from_numpy(faces[0].normed_embedding).unsqueeze(0)
face_image = face_align.norm_crop(image, landmark=faces[0].kps, image_size=224) # you can also segment the face
```
Then, you can generate images conditioned on the face embeddings:
```python
import torch
from diffusers import StableDiffusionPipeline, DDIMScheduler, AutoencoderKL
from PIL import Image
from ip_adapter.ip_adapter_faceid import IPAdapterFaceIDPlus
v2 = False
base_model_path = "SG161222/Realistic_Vision_V4.0_noVAE"
vae_model_path = "stabilityai/sd-vae-ft-mse"
image_encoder_path = "laion/CLIP-ViT-H-14-laion2B-s32B-b79K"
ip_ckpt = "ip-adapter-faceid-plus_sd15.bin" if not v2 else "ip-adapter-faceid-plusv2_sd15.bin"
device = "cuda"
noise_scheduler = DDIMScheduler(
num_train_timesteps=1000,
beta_start=0.00085,
beta_end=0.012,
beta_schedule="scaled_linear",
clip_sample=False,
set_alpha_to_one=False,
steps_offset=1,
)
vae = AutoencoderKL.from_pretrained(vae_model_path).to(dtype=torch.float16)
pipe = StableDiffusionPipeline.from_pretrained(
base_model_path,
torch_dtype=torch.float16,
scheduler=noise_scheduler,
vae=vae,
feature_extractor=None,
safety_checker=None
)
# load ip-adapter
ip_model = IPAdapterFaceIDPlus(pipe, image_encoder_path, ip_ckpt, device)
# generate image
prompt = "photo of a woman in red dress in a garden"
negative_prompt = "monochrome, lowres, bad anatomy, worst quality, low quality, blurry"
images = ip_model.generate(
prompt=prompt, negative_prompt=negative_prompt, face_image=face_image, faceid_embeds=faceid_embeds, shortcut=v2, s_scale=1.0,
num_samples=4, width=512, height=768, num_inference_steps=30, seed=2023
)
```
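For the PlusV2 weights (set `v2 = True` above), the face-structure weight mentioned in the introduction is exposed through the `s_scale` argument of `generate`. Below is a minimal sketch of sweeping it; the scale values and output file names are illustrative assumptions, not part of the original example:
```python
# Illustrative sweep over the face-structure weight (assumes v2 = True and that
# the variables from the example above are still in scope).
for s_scale in (0.5, 1.0, 1.5):  # assumed example values
    images = ip_model.generate(
        prompt=prompt,
        negative_prompt=negative_prompt,
        face_image=face_image,
        faceid_embeds=faceid_embeds,
        shortcut=True,    # PlusV2 path, as in the example above
        s_scale=s_scale,  # weight of the CLIP face-structure embedding
        num_samples=1, width=512, height=768, num_inference_steps=30, seed=2023,
    )
    images[0].save(f"faceid_plusv2_s_scale_{s_scale}.png")
```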
## Limitations and Bias
- The model does not achieve perfect photorealism and ID consistency.
- The generalization of the model is limited due to limitations of the training data, base model and face recognition model.
## Non-commercial use
**This model is released exclusively for research purposes and is not intended for commercial use.**
| ziffir/IP-Adapter-FaceID-0.1 | [
"language:en",
"text-to-image",
"stable-diffusion",
"arxiv:2308.06721",
"region:us"
] | 2024-01-05T19:16:11+00:00 | {"language": ["en"], "tags": ["text-to-image", "stable-diffusion"], "library_name": "diffusers"} | 2024-01-06T00:42:28+00:00 | [
"2308.06721"
] | [
"en"
] | TAGS
#language-English #text-to-image #stable-diffusion #arxiv-2308.06721 #region-us
|
# IP-Adapter-FaceID Model Card
<div align="center">
Project Page | Paper (ArXiv) | Code
</div>
---
## Introduction
An experimental version of IP-Adapter-FaceID: we use face ID embedding from a face recognition model instead of CLIP image embedding; additionally, we use LoRA to improve ID consistency. IP-Adapter-FaceID can generate images in various styles conditioned on a face with only text prompts.
!results
Update 2023/12/27:
IP-Adapter-FaceID-Plus: face ID embedding (for face ID) + CLIP image embedding (for face structure)
<div align="center">
!results
</div>
Update 2023/12/28:
IP-Adapter-FaceID-PlusV2: face ID embedding (for face ID) + controllable CLIP image embedding (for face structure)
You can adjust the weight of the face structure to get different generations!
<div align="center">
!results
</div>
Update 2024/01/04:
IP-Adapter-FaceID-SDXL: An experimental SDXL version of IP-Adapter-FaceID
<div align="center">
!results
</div>
## Usage
### IP-Adapter-FaceID
Firstly, you should use insightface to extract face ID embedding:
Then, you can generate images conditioned on the face embeddings:
### IP-Adapter-FaceID-SDXL
Firstly, you should use insightface to extract face ID embedding:
Then, you can generate images conditioned on the face embeddings:
### IP-Adapter-FaceID-Plus
Firstly, you should use insightface to extract face ID embedding and face image:
Then, you can generate images conditioned on the face embeddings:
## Limitations and Bias
- The model does not achieve perfect photorealism and ID consistency.
- The generalization of the model is limited due to limitations of the training data, base model and face recognition model.
## Non-commercial use
This model is released exclusively for research purposes and is not intended for commercial use.
| [
"# IP-Adapter-FaceID Model Card\n\n\n<div align=\"center\">\n\nProject Page | Paper (ArXiv) | Code\n</div>\n\n---",
"## Introduction\n\nAn experimental version of IP-Adapter-FaceID: we use face ID embedding from a face recognition model instead of CLIP image embedding, additionally, we use LoRA to improve ID consistency. IP-Adapter-FaceID can generate various style images conditioned on a face with only text prompts. \n\n!results\n\n\nUpdate 2023/12/27: \n\nIP-Adapter-FaceID-Plus: face ID embedding (for face ID) + CLIP image embedding (for face structure)\n\n<div align=\"center\"> \n\n!results\n</div>\n\nUpdate 2023/12/28: \n\nIP-Adapter-FaceID-PlusV2: face ID embedding (for face ID) + controllable CLIP image embedding (for face structure)\n\nYou can adjust the weight of the face structure to get different generation!\n\n<div align=\"center\"> \n\n!results\n</div>\n\nUpdate 2024/01/04: \n\nIP-Adapter-FaceID-SDXL: An experimental SDXL version of IP-Adapter-FaceID\n\n<div align=\"center\"> \n\n!results\n</div>",
"## Usage",
"### IP-Adapter-FaceID\n\nFirstly, you should use insightface to extract face ID embedding:\n\n\n\nThen, you can generate images conditioned on the face embeddings:",
"### IP-Adapter-FaceID-SDXL\n\nFirstly, you should use insightface to extract face ID embedding:\n\n\n\nThen, you can generate images conditioned on the face embeddings:",
"### IP-Adapter-FaceID-Plus\n\nFirstly, you should use insightface to extract face ID embedding and face image:\n\n\n\nThen, you can generate images conditioned on the face embeddings:",
"## Limitations and Bias\n- The model does not achieve perfect photorealism and ID consistency.\n- The generalization of the model is limited due to limitations of the training data, base model and face recognition model.",
"## Non-commercial use\nThis model is released exclusively for research purposes and is not intended for commercial use."
] | [
"TAGS\n#language-English #text-to-image #stable-diffusion #arxiv-2308.06721 #region-us \n",
"# IP-Adapter-FaceID Model Card\n\n\n<div align=\"center\">\n\nProject Page | Paper (ArXiv) | Code\n</div>\n\n---",
"## Introduction\n\nAn experimental version of IP-Adapter-FaceID: we use face ID embedding from a face recognition model instead of CLIP image embedding, additionally, we use LoRA to improve ID consistency. IP-Adapter-FaceID can generate various style images conditioned on a face with only text prompts. \n\n!results\n\n\nUpdate 2023/12/27: \n\nIP-Adapter-FaceID-Plus: face ID embedding (for face ID) + CLIP image embedding (for face structure)\n\n<div align=\"center\"> \n\n!results\n</div>\n\nUpdate 2023/12/28: \n\nIP-Adapter-FaceID-PlusV2: face ID embedding (for face ID) + controllable CLIP image embedding (for face structure)\n\nYou can adjust the weight of the face structure to get different generation!\n\n<div align=\"center\"> \n\n!results\n</div>\n\nUpdate 2024/01/04: \n\nIP-Adapter-FaceID-SDXL: An experimental SDXL version of IP-Adapter-FaceID\n\n<div align=\"center\"> \n\n!results\n</div>",
"## Usage",
"### IP-Adapter-FaceID\n\nFirstly, you should use insightface to extract face ID embedding:\n\n\n\nThen, you can generate images conditioned on the face embeddings:",
"### IP-Adapter-FaceID-SDXL\n\nFirstly, you should use insightface to extract face ID embedding:\n\n\n\nThen, you can generate images conditioned on the face embeddings:",
"### IP-Adapter-FaceID-Plus\n\nFirstly, you should use insightface to extract face ID embedding and face image:\n\n\n\nThen, you can generate images conditioned on the face embeddings:",
"## Limitations and Bias\n- The model does not achieve perfect photorealism and ID consistency.\n- The generalization of the model is limited due to limitations of the training data, base model and face recognition model.",
"## Non-commercial use\nThis model is released exclusively for research purposes and is not intended for commercial use."
] | [
32,
37,
255,
3,
43,
46,
48,
46,
25
] | [
"passage: TAGS\n#language-English #text-to-image #stable-diffusion #arxiv-2308.06721 #region-us \n# IP-Adapter-FaceID Model Card\n\n\n<div align=\"center\">\n\nProject Page | Paper (ArXiv) | Code\n</div>\n\n---## Introduction\n\nAn experimental version of IP-Adapter-FaceID: we use face ID embedding from a face recognition model instead of CLIP image embedding, additionally, we use LoRA to improve ID consistency. IP-Adapter-FaceID can generate various style images conditioned on a face with only text prompts. \n\n!results\n\n\nUpdate 2023/12/27: \n\nIP-Adapter-FaceID-Plus: face ID embedding (for face ID) + CLIP image embedding (for face structure)\n\n<div align=\"center\"> \n\n!results\n</div>\n\nUpdate 2023/12/28: \n\nIP-Adapter-FaceID-PlusV2: face ID embedding (for face ID) + controllable CLIP image embedding (for face structure)\n\nYou can adjust the weight of the face structure to get different generation!\n\n<div align=\"center\"> \n\n!results\n</div>\n\nUpdate 2024/01/04: \n\nIP-Adapter-FaceID-SDXL: An experimental SDXL version of IP-Adapter-FaceID\n\n<div align=\"center\"> \n\n!results\n</div>## Usage### IP-Adapter-FaceID\n\nFirstly, you should use insightface to extract face ID embedding:\n\n\n\nThen, you can generate images conditioned on the face embeddings:### IP-Adapter-FaceID-SDXL\n\nFirstly, you should use insightface to extract face ID embedding:\n\n\n\nThen, you can generate images conditioned on the face embeddings:### IP-Adapter-FaceID-Plus\n\nFirstly, you should use insightface to extract face ID embedding and face image:\n\n\n\nThen, you can generate images conditioned on the face embeddings:"
] |
1da31d7f399e5067d424c69c61ef11679aa4b40e | # Dataset Card for "araproje_mmlu_tr_s1_split"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ibranze/araproje_mmlu_tr_s1_split | [
"region:us"
] | 2024-01-05T19:17:35+00:00 | {"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "subject", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "dev", "num_bytes": 4218, "num_examples": 5}, {"name": "validation", "num_bytes": 133186, "num_examples": 245}], "download_size": 89474, "dataset_size": 137404}, "configs": [{"config_name": "default", "data_files": [{"split": "dev", "path": "data/dev-*"}, {"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-05T19:22:55+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "araproje_mmlu_tr_s1_split"
More Information needed | [
"# Dataset Card for \"araproje_mmlu_tr_s1_split\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"araproje_mmlu_tr_s1_split\"\n\nMore Information needed"
] | [
6,
24
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"araproje_mmlu_tr_s1_split\"\n\nMore Information needed"
] |
cdf403ce12f01022a0c36e584e588c0b9cebc4af | ## Dataset Description
- **Homepage:** https://image-net.org/index.php
- **Paper:** https://arxiv.org/abs/1409.0575
### Dataset Summary
ILSVRC 2012, commonly known as 'ImageNet' is an image dataset organized according to the WordNet hierarchy. Each meaningful concept in WordNet, possibly described by multiple words or word phrases, is called a "synonym set" or "synset". There are more than 100,000 synsets in WordNet, majority of them are nouns (80,000+). ImageNet aims to provide on average 1000 images to illustrate each synset. Images of each concept are quality-controlled and human-annotated.
💡 This dataset provides access to ImageNet (ILSVRC) 2012, which is the most commonly used **subset** of ImageNet. This dataset spans 1000 object classes and contains 1,281,167 training images, 50,000 validation images and 100,000 test images. The [patch](https://drive.google.com/file/d/16RYnHpVOW0XKCsn3G3S9GTHUyoV2-4WX/view) that fixes some of the corrupted test set images has already been applied to this version. For the full ImageNet dataset presented in [[2]](https://ieeexplore.ieee.org/abstract/document/5206848), please check the download section of the [main website](https://image-net.org/download-images.php).
### Data Splits
Unlike ImageNet-1k (ILSVRC 2012), the full ImageNet dataset has no defined splits. This subset includes a validation split consisting of 40 samples per 11821 classes.
#### Train
* `imagenet1k-train-{0000..1023}.tar`
* 1281167 samples over 1024 shards
#### Validation
* `imagenet1k-validation-{0000..0063}.tar`
* 50000 samples over 63 shards
### Processing
The webdataset shards were converted from TFDS shards matching the splits in TFDS ImageNet-1k.
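Because the shards are plain webdataset tar files, they can also be streamed directly with the `webdataset` package rather than through `datasets`. A minimal sketch follows; the per-sample keys (`jpg`, `cls`) and the local shard paths are assumptions about the tar layout, not documented on this card:
```python
# Sketch of streaming the validation shards with the `webdataset` package.
# The shard pattern comes from the card above; the sample keys are assumed.
import webdataset as wds

shards = "imagenet1k-validation-{0000..0063}.tar"  # local paths or URLs to the shards
dataset = (
    wds.WebDataset(shards)
    .decode("pil")           # decode image bytes to PIL images
    .to_tuple("jpg", "cls")  # assumed keys: image and integer class label
)

for image, label in dataset:
    print(image.size, label)
    break
```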
## Additional Information
### Dataset Curators
Authors of [[1]](https://arxiv.org/abs/1409.0575) and [[2]](https://ieeexplore.ieee.org/abstract/document/5206848):
- Olga Russakovsky
- Jia Deng
- Hao Su
- Jonathan Krause
- Sanjeev Satheesh
- Wei Dong
- Richard Socher
- Li-Jia Li
- Kai Li
- Sean Ma
- Zhiheng Huang
- Andrej Karpathy
- Aditya Khosla
- Michael Bernstein
- Alexander C Berg
- Li Fei-Fei
### Licensing Information
In exchange for permission to use the ImageNet database (the "Database") at Princeton University and Stanford University, Researcher hereby agrees to the following terms and conditions:
1. Researcher shall use the Database only for non-commercial research and educational purposes.
1. Princeton University and Stanford University make no representations or warranties regarding the Database, including but not limited to warranties of non-infringement or fitness for a particular purpose.
1. Researcher accepts full responsibility for his or her use of the Database and shall defend and indemnify the ImageNet team, Princeton University, and Stanford University, including their employees, Trustees, officers and agents, against any and all claims arising from Researcher's use of the Database, including but not limited to Researcher's use of any copies of copyrighted images that he or she may create from the Database.
1. Researcher may provide research associates and colleagues with access to the Database provided that they first agree to be bound by these terms and conditions.
1. Princeton University and Stanford University reserve the right to terminate Researcher's access to the Database at any time.
1. If Researcher is employed by a for-profit, commercial entity, Researcher's employer shall also be bound by these terms and conditions, and Researcher hereby represents that he or she is fully authorized to enter into this agreement on behalf of such employer.
1. The law of the State of New Jersey shall apply to all disputes under this agreement.
### Citation Information
```bibtex
@article{imagenet15russakovsky,
Author = {Olga Russakovsky and Jia Deng and Hao Su and Jonathan Krause and Sanjeev Satheesh and Sean Ma and Zhiheng Huang and Andrej Karpathy and Aditya Khosla and Michael Bernstein and Alexander C. Berg and Li Fei-Fei},
Title = { {ImageNet Large Scale Visual Recognition Challenge} },
Year = {2015},
journal = {International Journal of Computer Vision (IJCV)},
doi = {10.1007/s11263-015-0816-y},
volume={115},
number={3},
pages={211-252}
}
``` | timm/imagenet-1k-wds | [
"task_categories:image-classification",
"size_categories:100K<n<1M",
"license:other",
"webdataset",
"arxiv:1409.0575",
"region:us"
] | 2024-01-05T19:33:34+00:00 | {"license": "other", "size_categories": ["100K<n<1M"], "task_categories": ["image-classification"], "pretty_name": "ImageNet-1k", "license_name": "imagenet", "license_link": "https://www.image-net.org/download.php", "extra_gated_prompt": "By clicking on \u201cAccess repository\u201d below, you also agree to ImageNet Terms of Access:\n[RESEARCHER_FULLNAME] (the \"Researcher\") has requested permission to use the ImageNet database (the \"Database\") at Princeton University and Stanford University. In exchange for such permission, Researcher hereby agrees to the following terms and conditions:\n1. Researcher shall use the Database only for non-commercial research and educational purposes.\n2. Princeton University, Stanford University and Hugging Face make no representations or warranties regarding the Database, including but not limited to warranties of non-infringement or fitness for a particular purpose.\n3. Researcher accepts full responsibility for his or her use of the Database and shall defend and indemnify the ImageNet team, Princeton University, Stanford University and Hugging Face, including their employees, Trustees, officers and agents, against any and all claims arising from Researcher's use of the Database, including but not limited to Researcher's use of any copies of copyrighted images that he or she may create from the Database.\n4. Researcher may provide research associates and colleagues with access to the Database provided that they first agree to be bound by these terms and conditions.\n5. Princeton University, Stanford University and Hugging Face reserve the right to terminate Researcher's access to the Database at any time.\n6. If Researcher is employed by a for-profit, commercial entity, Researcher's employer shall also be bound by these terms and conditions, and Researcher hereby represents that he or she is fully authorized to enter into this agreement on behalf of such employer.\n7. The law of the State of New Jersey shall apply to all disputes under this agreement.", "tags": ["webdataset"]} | 2024-01-07T18:12:43+00:00 | [
"1409.0575"
] | [] | TAGS
#task_categories-image-classification #size_categories-100K<n<1M #license-other #webdataset #arxiv-1409.0575 #region-us
| ## Dataset Description
- Homepage: URL
- Paper: URL
### Dataset Summary
ILSVRC 2012, commonly known as 'ImageNet' is an image dataset organized according to the WordNet hierarchy. Each meaningful concept in WordNet, possibly described by multiple words or word phrases, is called a "synonym set" or "synset". There are more than 100,000 synsets in WordNet, majority of them are nouns (80,000+). ImageNet aims to provide on average 1000 images to illustrate each synset. Images of each concept are quality-controlled and human-annotated.
This dataset provides access to ImageNet (ILSVRC) 2012 which is the most commonly used subset of ImageNet. This dataset spans 1000 object classes and contains 1,281,167 training images, 50,000 validation images and 100,000 test images. The version also has the patch which fixes some of the corrupted test set images already applied. For full ImageNet dataset presented in [[2]](URL please check the download section of the main website.
### Data Splits
Unlike ImageNet-1k (ILSVRC 2012), the full ImageNet dataset has no defined splits. This subset includes a validation split consisting of 40 samples per 11821 classes.
#### Train
* 'imagenet1k-train-{0000..1023}.tar'
* 1281167 samples over 1024 shards
#### Validation
* 'imagenet1k-validation-{0000..0063}.tar'
* 50000 samples over 63 shards
### Processing
The webdataset shards were converted from TFDS shards matching the splits in TFDS ImageNet-1k.
## Additional Information
### Dataset Curators
Authors of [[1]](URL and [[2]](URL
- Olga Russakovsky
- Jia Deng
- Hao Su
- Jonathan Krause
- Sanjeev Satheesh
- Wei Dong
- Richard Socher
- Li-Jia Li
- Kai Li
- Sean Ma
- Zhiheng Huang
- Andrej Karpathy
- Aditya Khosla
- Michael Bernstein
- Alexander C Berg
- Li Fei-Fei
### Licensing Information
In exchange for permission to use the ImageNet database (the "Database") at Princeton University and Stanford University, Researcher hereby agrees to the following terms and conditions:
1. Researcher shall use the Database only for non-commercial research and educational purposes.
1. Princeton University and Stanford University make no representations or warranties regarding the Database, including but not limited to warranties of non-infringement or fitness for a particular purpose.
1. Researcher accepts full responsibility for his or her use of the Database and shall defend and indemnify the ImageNet team, Princeton University, and Stanford University, including their employees, Trustees, officers and agents, against any and all claims arising from Researcher's use of the Database, including but not limited to Researcher's use of any copies of copyrighted images that he or she may create from the Database.
1. Researcher may provide research associates and colleagues with access to the Database provided that they first agree to be bound by these terms and conditions.
1. Princeton University and Stanford University reserve the right to terminate Researcher's access to the Database at any time.
1. If Researcher is employed by a for-profit, commercial entity, Researcher's employer shall also be bound by these terms and conditions, and Researcher hereby represents that he or she is fully authorized to enter into this agreement on behalf of such employer.
1. The law of the State of New Jersey shall apply to all disputes under this agreement.
| [
"## Dataset Description\n\n- Homepage: URL\n- Paper: URL",
"### Dataset Summary\n\nILSVRC 2012, commonly known as 'ImageNet' is an image dataset organized according to the WordNet hierarchy. Each meaningful concept in WordNet, possibly described by multiple words or word phrases, is called a \"synonym set\" or \"synset\". There are more than 100,000 synsets in WordNet, majority of them are nouns (80,000+). ImageNet aims to provide on average 1000 images to illustrate each synset. Images of each concept are quality-controlled and human-annotated.\n\n This dataset provides access to ImageNet (ILSVRC) 2012 which is the most commonly used subset of ImageNet. This dataset spans 1000 object classes and contains 1,281,167 training images, 50,000 validation images and 100,000 test images. The version also has the patch which fixes some of the corrupted test set images already applied. For full ImageNet dataset presented in [[2]](URL please check the download section of the main website.",
"### Data Splits\n\nUnlike ImageNet-1k (ILSVRC 2012), the full ImageNet dataset has no defined splits. This subset includes a validation split consiting of 40 samples per 11821 classes.",
"#### Train\n* 'imagenet1k-train-{0000..1023}.tar'\n* 1281167 samples over 1024 shards",
"#### Validation\n* 'imagenet1k-validation-{0000..0063}.tar'\n* 50000 samples over 63 shards",
"### Processing\n\nThe webdataset shards were converted from TFDS shards matching the splits in TFDS ImageNet-1k.",
"## Additional Information",
"### Dataset Curators\n\nAuthors of [[1]](URL and [[2]](URL\n\n- Olga Russakovsky\n- Jia Deng\n- Hao Su\n- Jonathan Krause\n- Sanjeev Satheesh\n- Wei Dong\n- Richard Socher\n- Li-Jia Li\n- Kai Li\n- Sean Ma\n- Zhiheng Huang\n- Andrej Karpathy\n- Aditya Khosla\n- Michael Bernstein\n- Alexander C Berg\n- Li Fei-Fei",
"### Licensing Information\n\nIn exchange for permission to use the ImageNet database (the \"Database\") at Princeton University and Stanford University, Researcher hereby agrees to the following terms and conditions:\n\n1. Researcher shall use the Database only for non-commercial research and educational purposes.\n1. Princeton University and Stanford University make no representations or warranties regarding the Database, including but not limited to warranties of non-infringement or fitness for a particular purpose.\n1. Researcher accepts full responsibility for his or her use of the Database and shall defend and indemnify the ImageNet team, Princeton University, and Stanford University, including their employees, Trustees, officers and agents, against any and all claims arising from Researcher's use of the Database, including but not limited to Researcher's use of any copies of copyrighted images that he or she may create from the Database.\n1. Researcher may provide research associates and colleagues with access to the Database provided that they first agree to be bound by these terms and conditions.\n1. Princeton University and Stanford University reserve the right to terminate Researcher's access to the Database at any time.\n1. If Researcher is employed by a for-profit, commercial entity, Researcher's employer shall also be bound by these terms and conditions, and Researcher hereby represents that he or she is fully authorized to enter into this agreement on behalf of such employer.\n1. The law of the State of New Jersey shall apply to all disputes under this agreement."
] | [
"TAGS\n#task_categories-image-classification #size_categories-100K<n<1M #license-other #webdataset #arxiv-1409.0575 #region-us \n",
"## Dataset Description\n\n- Homepage: URL\n- Paper: URL",
"### Dataset Summary\n\nILSVRC 2012, commonly known as 'ImageNet' is an image dataset organized according to the WordNet hierarchy. Each meaningful concept in WordNet, possibly described by multiple words or word phrases, is called a \"synonym set\" or \"synset\". There are more than 100,000 synsets in WordNet, majority of them are nouns (80,000+). ImageNet aims to provide on average 1000 images to illustrate each synset. Images of each concept are quality-controlled and human-annotated.\n\n This dataset provides access to ImageNet (ILSVRC) 2012 which is the most commonly used subset of ImageNet. This dataset spans 1000 object classes and contains 1,281,167 training images, 50,000 validation images and 100,000 test images. The version also has the patch which fixes some of the corrupted test set images already applied. For full ImageNet dataset presented in [[2]](URL please check the download section of the main website.",
"### Data Splits\n\nUnlike ImageNet-1k (ILSVRC 2012), the full ImageNet dataset has no defined splits. This subset includes a validation split consiting of 40 samples per 11821 classes.",
"#### Train\n* 'imagenet1k-train-{0000..1023}.tar'\n* 1281167 samples over 1024 shards",
"#### Validation\n* 'imagenet1k-validation-{0000..0063}.tar'\n* 50000 samples over 63 shards",
"### Processing\n\nThe webdataset shards were converted from TFDS shards matching the splits in TFDS ImageNet-1k.",
"## Additional Information",
"### Dataset Curators\n\nAuthors of [[1]](URL and [[2]](URL\n\n- Olga Russakovsky\n- Jia Deng\n- Hao Su\n- Jonathan Krause\n- Sanjeev Satheesh\n- Wei Dong\n- Richard Socher\n- Li-Jia Li\n- Kai Li\n- Sean Ma\n- Zhiheng Huang\n- Andrej Karpathy\n- Aditya Khosla\n- Michael Bernstein\n- Alexander C Berg\n- Li Fei-Fei",
"### Licensing Information\n\nIn exchange for permission to use the ImageNet database (the \"Database\") at Princeton University and Stanford University, Researcher hereby agrees to the following terms and conditions:\n\n1. Researcher shall use the Database only for non-commercial research and educational purposes.\n1. Princeton University and Stanford University make no representations or warranties regarding the Database, including but not limited to warranties of non-infringement or fitness for a particular purpose.\n1. Researcher accepts full responsibility for his or her use of the Database and shall defend and indemnify the ImageNet team, Princeton University, and Stanford University, including their employees, Trustees, officers and agents, against any and all claims arising from Researcher's use of the Database, including but not limited to Researcher's use of any copies of copyrighted images that he or she may create from the Database.\n1. Researcher may provide research associates and colleagues with access to the Database provided that they first agree to be bound by these terms and conditions.\n1. Princeton University and Stanford University reserve the right to terminate Researcher's access to the Database at any time.\n1. If Researcher is employed by a for-profit, commercial entity, Researcher's employer shall also be bound by these terms and conditions, and Researcher hereby represents that he or she is fully authorized to enter into this agreement on behalf of such employer.\n1. The law of the State of New Jersey shall apply to all disputes under this agreement."
] | [
46,
12,
222,
50,
33,
35,
33,
5,
96,
327
] | [
"passage: TAGS\n#task_categories-image-classification #size_categories-100K<n<1M #license-other #webdataset #arxiv-1409.0575 #region-us \n## Dataset Description\n\n- Homepage: URL\n- Paper: URL### Dataset Summary\n\nILSVRC 2012, commonly known as 'ImageNet' is an image dataset organized according to the WordNet hierarchy. Each meaningful concept in WordNet, possibly described by multiple words or word phrases, is called a \"synonym set\" or \"synset\". There are more than 100,000 synsets in WordNet, majority of them are nouns (80,000+). ImageNet aims to provide on average 1000 images to illustrate each synset. Images of each concept are quality-controlled and human-annotated.\n\n This dataset provides access to ImageNet (ILSVRC) 2012 which is the most commonly used subset of ImageNet. This dataset spans 1000 object classes and contains 1,281,167 training images, 50,000 validation images and 100,000 test images. The version also has the patch which fixes some of the corrupted test set images already applied. For full ImageNet dataset presented in [[2]](URL please check the download section of the main website.### Data Splits\n\nUnlike ImageNet-1k (ILSVRC 2012), the full ImageNet dataset has no defined splits. This subset includes a validation split consiting of 40 samples per 11821 classes.#### Train\n* 'imagenet1k-train-{0000..1023}.tar'\n* 1281167 samples over 1024 shards#### Validation\n* 'imagenet1k-validation-{0000..0063}.tar'\n* 50000 samples over 63 shards### Processing\n\nThe webdataset shards were converted from TFDS shards matching the splits in TFDS ImageNet-1k.## Additional Information"
] |
a6d4c51b0e27ab93f39c0c44a79b720c81741972 | # Dataset Card for "kdd210_hourly_24"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | LeoTungAnh/kdd210_hourly_24 | [
"region:us"
] | 2024-01-05T19:57:39+00:00 | {"dataset_info": {"features": [{"name": "start", "dtype": "timestamp[s]"}, {"name": "feat_static_cat", "sequence": "uint64"}, {"name": "feat_dynamic_real", "sequence": {"sequence": "float32"}}, {"name": "item_id", "dtype": "string"}, {"name": "target", "sequence": "float64"}], "splits": [{"name": "train", "num_bytes": 18235479, "num_examples": 210}, {"name": "validation", "num_bytes": 18275799, "num_examples": 210}, {"name": "test", "num_bytes": 18316119, "num_examples": 210}], "download_size": 47862588, "dataset_size": 54827397}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-05T19:58:13+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "kdd210_hourly_24"
More Information needed | [
"# Dataset Card for \"kdd210_hourly_24\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"kdd210_hourly_24\"\n\nMore Information needed"
] | [
6,
19
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"kdd210_hourly_24\"\n\nMore Information needed"
] |
a4f8a453891a11e2e0d3c464548fc130d4a4a4f3 | # Dataset Card for "processed_bert_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | canlinzhang/processed_bert_dataset | [
"region:us"
] | 2024-01-05T19:57:40+00:00 | {"dataset_info": {"features": [{"name": "input_ids", "sequence": "int32"}, {"name": "token_type_ids", "sequence": "int8"}, {"name": "attention_mask", "sequence": "int8"}, {"name": "special_tokens_mask", "sequence": "int8"}], "splits": [{"name": "train", "num_bytes": 8473255200.0, "num_examples": 2353682}], "download_size": 2275917790, "dataset_size": 8473255200.0}} | 2024-01-05T20:07:47+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "processed_bert_dataset"
More Information needed | [
"# Dataset Card for \"processed_bert_dataset\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"processed_bert_dataset\"\n\nMore Information needed"
] | [
6,
17
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"processed_bert_dataset\"\n\nMore Information needed"
] |
cc03c559b7c68696cc990feee2e5992a5539b23c | # Dataset Card for "kdd210_hourly_48"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | LeoTungAnh/kdd210_hourly_48 | [
"region:us"
] | 2024-01-05T20:02:27+00:00 | {"dataset_info": {"features": [{"name": "start", "dtype": "timestamp[s]"}, {"name": "feat_static_cat", "sequence": "uint64"}, {"name": "feat_dynamic_real", "sequence": {"sequence": "float32"}}, {"name": "item_id", "dtype": "string"}, {"name": "target", "sequence": "float64"}], "splits": [{"name": "train", "num_bytes": 18154839, "num_examples": 210}, {"name": "validation", "num_bytes": 18235479, "num_examples": 210}, {"name": "test", "num_bytes": 18316119, "num_examples": 210}], "download_size": 47737715, "dataset_size": 54706437}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-05T20:03:51+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "kdd210_hourly_48"
More Information needed | [
"# Dataset Card for \"kdd210_hourly_48\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"kdd210_hourly_48\"\n\nMore Information needed"
] | [
6,
19
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"kdd210_hourly_48\"\n\nMore Information needed"
] |
16b905762da092cb2ca33609679d378e3ed186cc | # Dataset Card for "kdd210_hourly_96"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | LeoTungAnh/kdd210_hourly_96 | [
"region:us"
] | 2024-01-05T20:02:39+00:00 | {"dataset_info": {"features": [{"name": "start", "dtype": "timestamp[s]"}, {"name": "feat_static_cat", "sequence": "uint64"}, {"name": "feat_dynamic_real", "sequence": {"sequence": "float32"}}, {"name": "item_id", "dtype": "string"}, {"name": "target", "sequence": "float64"}], "splits": [{"name": "train", "num_bytes": 17993559, "num_examples": 210}, {"name": "validation", "num_bytes": 18154839, "num_examples": 210}, {"name": "test", "num_bytes": 18316119, "num_examples": 210}], "download_size": 47500480, "dataset_size": 54464517}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-05T20:06:06+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "kdd210_hourly_96"
More Information needed | [
"# Dataset Card for \"kdd210_hourly_96\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"kdd210_hourly_96\"\n\nMore Information needed"
] | [
6,
19
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"kdd210_hourly_96\"\n\nMore Information needed"
] |
56df03061d4802a49fe8b072e03d784efa55ae3f | # Dataset Card for "kdd210_hourly_168"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | LeoTungAnh/kdd210_hourly_168 | [
"region:us"
] | 2024-01-05T20:02:49+00:00 | {"dataset_info": {"features": [{"name": "start", "dtype": "timestamp[s]"}, {"name": "feat_static_cat", "sequence": "uint64"}, {"name": "feat_dynamic_real", "sequence": {"sequence": "float32"}}, {"name": "item_id", "dtype": "string"}, {"name": "target", "sequence": "float64"}], "splits": [{"name": "train", "num_bytes": 17751639, "num_examples": 210}, {"name": "validation", "num_bytes": 18033879, "num_examples": 210}, {"name": "test", "num_bytes": 18316119, "num_examples": 210}], "download_size": 47174334, "dataset_size": 54101637}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-05T20:08:04+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "kdd210_hourly_168"
More Information needed | [
"# Dataset Card for \"kdd210_hourly_168\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"kdd210_hourly_168\"\n\nMore Information needed"
] | [
6,
19
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"kdd210_hourly_168\"\n\nMore Information needed"
] |
e78c83e83338a0e787504c39a5a7f7c66071ba53 | # Dataset Card for "kdd210_hourly_336"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | LeoTungAnh/kdd210_hourly_336 | [
"region:us"
] | 2024-01-05T20:03:01+00:00 | {"dataset_info": {"features": [{"name": "start", "dtype": "timestamp[s]"}, {"name": "feat_static_cat", "sequence": "uint64"}, {"name": "feat_dynamic_real", "sequence": {"sequence": "float32"}}, {"name": "item_id", "dtype": "string"}, {"name": "target", "sequence": "float64"}], "splits": [{"name": "train", "num_bytes": 17187159, "num_examples": 210}, {"name": "validation", "num_bytes": 17751639, "num_examples": 210}, {"name": "test", "num_bytes": 18316119, "num_examples": 210}], "download_size": 46384794, "dataset_size": 53254917}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2024-01-05T20:09:19+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "kdd210_hourly_336"
More Information needed | [
"# Dataset Card for \"kdd210_hourly_336\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"kdd210_hourly_336\"\n\nMore Information needed"
] | [
6,
20
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"kdd210_hourly_336\"\n\nMore Information needed"
] |
379aee5b39d99b1f90ce5f24338861d8f1017456 | # Dataset Card for "araproje_mmlu_tr_conf1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ibranze/araproje_mmlu_tr_conf1 | [
"region:us"
] | 2024-01-05T20:03:09+00:00 | {"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "subject", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "validation", "num_bytes": 137404.0, "num_examples": 250}], "download_size": 82980, "dataset_size": 137404.0}, "configs": [{"config_name": "default", "data_files": [{"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-05T20:32:05+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "araproje_mmlu_tr_conf1"
More Information needed | [
"# Dataset Card for \"araproje_mmlu_tr_conf1\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"araproje_mmlu_tr_conf1\"\n\nMore Information needed"
] | [
6,
21
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"araproje_mmlu_tr_conf1\"\n\nMore Information needed"
] |
80360309af50983a3e24a6007677f5f7735cad08 | # Dataset Card for "araproje_mmlu_tr_conf2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | ibranze/araproje_mmlu_tr_conf2 | [
"region:us"
] | 2024-01-05T20:05:53+00:00 | {"dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "subject", "dtype": "string"}, {"name": "choices", "sequence": "string"}, {"name": "answer", "dtype": {"class_label": {"names": {"0": "A", "1": "B", "2": "C", "3": "D"}}}}], "splits": [{"name": "validation", "num_bytes": 137404.0, "num_examples": 250}], "download_size": 82743, "dataset_size": 137404.0}, "configs": [{"config_name": "default", "data_files": [{"split": "validation", "path": "data/validation-*"}]}]} | 2024-01-05T20:32:09+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "araproje_mmlu_tr_conf2"
More Information needed | [
"# Dataset Card for \"araproje_mmlu_tr_conf2\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"araproje_mmlu_tr_conf2\"\n\nMore Information needed"
] | [
6,
21
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"araproje_mmlu_tr_conf2\"\n\nMore Information needed"
] |
6dc815789aee8e546f9d60b191676056a9a400d7 |
# Dataset Card for Evaluation run of FinancialSupport/saiga-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [FinancialSupport/saiga-7b](https://huggingface.co/FinancialSupport/saiga-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FinancialSupport__saiga-7b",
"harness_winogrande_5",
split="train")
```
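The aggregated metrics described above can be loaded the same way by pointing at the "results" configuration; a sketch following the naming used in this card:
```python
from datasets import load_dataset

# "results" is the aggregated-results configuration described above;
# the "train" split points to the latest run.
results = load_dataset("open-llm-leaderboard/details_FinancialSupport__saiga-7b",
	"results",
	split="train")
```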
## Latest results
These are the [latest results from run 2024-01-06T09:45:13.910276](https://huggingface.co/datasets/open-llm-leaderboard/details_FinancialSupport__saiga-7b/blob/main/results_2024-01-06T09-45-13.910276.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6166185262751313,
"acc_stderr": 0.032849020143055976,
"acc_norm": 0.6205158254144141,
"acc_norm_stderr": 0.033513597535050406,
"mc1": 0.3818849449204406,
"mc1_stderr": 0.017008101939163498,
"mc2": 0.5499412348890741,
"mc2_stderr": 0.01546920530349635
},
"harness|arc:challenge|25": {
"acc": 0.5930034129692833,
"acc_stderr": 0.01435639941800912,
"acc_norm": 0.6313993174061433,
"acc_norm_stderr": 0.0140978106780422
},
"harness|hellaswag|10": {
"acc": 0.6319458275243975,
"acc_stderr": 0.004812905279066438,
"acc_norm": 0.8314080860386377,
"acc_norm_stderr": 0.0037362592995204874
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6716981132075471,
"acc_stderr": 0.02890159361241178,
"acc_norm": 0.6716981132075471,
"acc_norm_stderr": 0.02890159361241178
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.038990736873573344,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.038990736873573344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.03750757044895536,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.03750757044895536
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5191489361702127,
"acc_stderr": 0.03266204299064678,
"acc_norm": 0.5191489361702127,
"acc_norm_stderr": 0.03266204299064678
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419034,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419034
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.025107425481137285,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.025107425481137285
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7419354838709677,
"acc_stderr": 0.024892469172462846,
"acc_norm": 0.7419354838709677,
"acc_norm_stderr": 0.024892469172462846
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.03515895551165698,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.03515895551165698
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.035014387062967806,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.035014387062967806
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.02614848346915332,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.02614848346915332
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5948717948717949,
"acc_stderr": 0.024890471769938145,
"acc_norm": 0.5948717948717949,
"acc_norm_stderr": 0.024890471769938145
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131157,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131157
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121622,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121622
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8073394495412844,
"acc_stderr": 0.016909276884936066,
"acc_norm": 0.8073394495412844,
"acc_norm_stderr": 0.016909276884936066
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.029102254389674082,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.029102254389674082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8347107438016529,
"acc_stderr": 0.03390780612972776,
"acc_norm": 0.8347107438016529,
"acc_norm_stderr": 0.03390780612972776
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946315,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946315
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8007662835249042,
"acc_stderr": 0.014283378044296417,
"acc_norm": 0.8007662835249042,
"acc_norm_stderr": 0.014283378044296417
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6734104046242775,
"acc_stderr": 0.025248264774242832,
"acc_norm": 0.6734104046242775,
"acc_norm_stderr": 0.025248264774242832
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4245810055865922,
"acc_stderr": 0.01653117099327888,
"acc_norm": 0.4245810055865922,
"acc_norm_stderr": 0.01653117099327888
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.025917806117147158,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.025917806117147158
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6816720257234726,
"acc_stderr": 0.02645722506781103,
"acc_norm": 0.6816720257234726,
"acc_norm_stderr": 0.02645722506781103
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.025842248700902168,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.025842248700902168
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.02968010556502904,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.02968010556502904
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44198174706649285,
"acc_stderr": 0.01268397251359881,
"acc_norm": 0.44198174706649285,
"acc_norm_stderr": 0.01268397251359881
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.02972215209928006,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.02972215209928006
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6225490196078431,
"acc_stderr": 0.019610851474880286,
"acc_norm": 0.6225490196078431,
"acc_norm_stderr": 0.019610851474880286
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291293,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291293
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7761194029850746,
"acc_stderr": 0.0294752502360172,
"acc_norm": 0.7761194029850746,
"acc_norm_stderr": 0.0294752502360172
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774707,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774707
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.02954774168764004,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.02954774168764004
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3818849449204406,
"mc1_stderr": 0.017008101939163498,
"mc2": 0.5499412348890741,
"mc2_stderr": 0.01546920530349635
},
"harness|winogrande|5": {
"acc": 0.7900552486187845,
"acc_stderr": 0.01144628062926263
},
"harness|gsm8k|5": {
"acc": 0.45109931766489764,
"acc_stderr": 0.013706458809664819
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
| open-llm-leaderboard/details_FinancialSupport__saiga-7b | ["region:us"
] | 2024-01-05T20:19:06+00:00 | {"pretty_name": "Evaluation run of FinancialSupport/saiga-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [FinancialSupport/saiga-7b](https://huggingface.co/FinancialSupport/saiga-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FinancialSupport__saiga-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-06T09:45:13.910276](https://huggingface.co/datasets/open-llm-leaderboard/details_FinancialSupport__saiga-7b/blob/main/results_2024-01-06T09-45-13.910276.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6166185262751313,\n \"acc_stderr\": 0.032849020143055976,\n \"acc_norm\": 0.6205158254144141,\n \"acc_norm_stderr\": 0.033513597535050406,\n \"mc1\": 0.3818849449204406,\n \"mc1_stderr\": 0.017008101939163498,\n \"mc2\": 0.5499412348890741,\n \"mc2_stderr\": 0.01546920530349635\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5930034129692833,\n \"acc_stderr\": 0.01435639941800912,\n \"acc_norm\": 0.6313993174061433,\n \"acc_norm_stderr\": 0.0140978106780422\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6319458275243975,\n \"acc_stderr\": 0.004812905279066438,\n \"acc_norm\": 0.8314080860386377,\n \"acc_norm_stderr\": 0.0037362592995204874\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.02890159361241178,\n \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.02890159361241178\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n \"acc_stderr\": 0.038990736873573344,\n \"acc_norm\": 0.6805555555555556,\n \"acc_norm_stderr\": 0.038990736873573344\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.03750757044895536,\n \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.03750757044895536\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5191489361702127,\n \"acc_stderr\": 0.03266204299064678,\n \"acc_norm\": 0.5191489361702127,\n \"acc_norm_stderr\": 0.03266204299064678\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419034,\n \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419034\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.025107425481137285,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.025107425481137285\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7419354838709677,\n \"acc_stderr\": 0.024892469172462846,\n \"acc_norm\": 0.7419354838709677,\n \"acc_norm_stderr\": 0.024892469172462846\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.03515895551165698,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.03515895551165698\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.035014387062967806,\n \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.035014387062967806\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.02614848346915332,\n \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.02614848346915332\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5948717948717949,\n \"acc_stderr\": 0.024890471769938145,\n \"acc_norm\": 0.5948717948717949,\n \"acc_norm_stderr\": 0.024890471769938145\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131157,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131157\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121622,\n \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121622\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8073394495412844,\n \"acc_stderr\": 0.016909276884936066,\n \"acc_norm\": 0.8073394495412844,\n \"acc_norm_stderr\": 0.016909276884936066\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977749,\n \"acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7794117647058824,\n \"acc_stderr\": 0.029102254389674082,\n \"acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.029102254389674082\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8347107438016529,\n \"acc_stderr\": 0.03390780612972776,\n \"acc_norm\": 0.8347107438016529,\n \"acc_norm_stderr\": 0.03390780612972776\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.042365112580946315,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.042365112580946315\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8007662835249042,\n \"acc_stderr\": 0.014283378044296417,\n \"acc_norm\": 0.8007662835249042,\n \"acc_norm_stderr\": 0.014283378044296417\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6734104046242775,\n \"acc_stderr\": 0.025248264774242832,\n \"acc_norm\": 0.6734104046242775,\n \"acc_norm_stderr\": 0.025248264774242832\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4245810055865922,\n \"acc_stderr\": 0.01653117099327888,\n \"acc_norm\": 0.4245810055865922,\n \"acc_norm_stderr\": 0.01653117099327888\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.025917806117147158,\n \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.025917806117147158\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n \"acc_stderr\": 0.02645722506781103,\n \"acc_norm\": 0.6816720257234726,\n \"acc_norm_stderr\": 0.02645722506781103\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.025842248700902168,\n \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.025842248700902168\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.450354609929078,\n \"acc_stderr\": 0.02968010556502904,\n \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.02968010556502904\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44198174706649285,\n \"acc_stderr\": 0.01268397251359881,\n \"acc_norm\": 0.44198174706649285,\n \"acc_norm_stderr\": 0.01268397251359881\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6029411764705882,\n \"acc_stderr\": 0.02972215209928006,\n \"acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.02972215209928006\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6225490196078431,\n \"acc_stderr\": 0.019610851474880286,\n \"acc_norm\": 0.6225490196078431,\n \"acc_norm_stderr\": 0.019610851474880286\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291293,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291293\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7761194029850746,\n \"acc_stderr\": 0.0294752502360172,\n \"acc_norm\": 0.7761194029850746,\n \"acc_norm_stderr\": 0.0294752502360172\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774707,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774707\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.02954774168764004,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.02954774168764004\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3818849449204406,\n \"mc1_stderr\": 0.017008101939163498,\n \"mc2\": 0.5499412348890741,\n \"mc2_stderr\": 0.01546920530349635\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7900552486187845,\n \"acc_stderr\": 0.01144628062926263\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.45109931766489764,\n \"acc_stderr\": 0.013706458809664819\n 
}\n}\n```", "repo_url": "https://huggingface.co/FinancialSupport/saiga-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|arc:challenge|25_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|arc:challenge|25_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|gsm8k|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|gsm8k|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hellaswag|10_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hellaswag|10_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T20-16-41.982110.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T20-16-41.982110.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-06T09-45-13.910276.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T09-45-13.910276.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T09-45-13.910276.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-06T09-45-13.910276.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T20-16-41.982110.parquet"]}, 
{"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["**/details_harness|winogrande|5_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": ["**/details_harness|winogrande|5_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-06T09-45-13.910276.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_05T20_16_41.982110", "path": ["results_2024-01-05T20-16-41.982110.parquet"]}, {"split": "2024_01_06T09_45_13.910276", "path": 
["results_2024-01-06T09-45-13.910276.parquet"]}, {"split": "latest", "path": ["results_2024-01-06T09-45-13.910276.parquet"]}]}]} | 2024-01-06T09:47:34+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of FinancialSupport/saiga-7b
Dataset automatically created during the evaluation run of model FinancialSupport/saiga-7b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
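A minimal loading sketch is shown below; the repository name is assumed from the leaderboard's usual `details_<org>__<model>` naming pattern, and the `harness_winogrande_5` configuration is used purely as an example:

```python
from datasets import load_dataset

# Repository name assumed from the usual "details_<org>__<model>" convention.
data = load_dataset(
    "open-llm-leaderboard/details_FinancialSupport__saiga-7b",
    "harness_winogrande_5",
    split="train",
)
```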
## Latest results
These are the latest results from run 2024-01-06T09:45:13.910276 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of FinancialSupport/saiga-7b\n\n\n\nDataset automatically created during the evaluation run of model FinancialSupport/saiga-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-06T09:45:13.910276(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of FinancialSupport/saiga-7b\n\n\n\nDataset automatically created during the evaluation run of model FinancialSupport/saiga-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-06T09:45:13.910276(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
179,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of FinancialSupport/saiga-7b\n\n\n\nDataset automatically created during the evaluation run of model FinancialSupport/saiga-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-06T09:45:13.910276(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
b749189185866990af77b27be61964fdf4083f7b |
# Winograd Schema Challenge examples included in the SuperGLUE Benchmark
Specifically: The wsc and wsc.fixed datasets from the HuggingFace "super_glue" repository.
### Data Fields
- **`text`** (*`str`*): The text of the schema.
- **`span1_index`** (*`int`*): Starting word index of first entity.
- **`span2_index`** (*`int`*): Starting word index of second entity.
- **`span1_text`** (*`str`*): Textual representation of first entity.
- **`span2_text`** (*`str`*): Textual representation of second entity.
- **`idx`** (*`int`*): Index of the example in the dataset.
- **`label`** (*`bool`*): True if the two spans corefer.
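A minimal usage sketch, assuming the `wsc.fixed` configuration and `validation` split declared in this repository's metadata; it only touches the fields listed above:

```python
from datasets import load_dataset

# "wsc" is the other configuration declared for this repository.
wsc = load_dataset("coref-data/superglue_wsc_raw", "wsc.fixed", split="validation")

example = wsc[0]
print(example["text"])
print(example["span1_text"], "<->", example["span2_text"])
print("corefer:", bool(example["label"]))  # True if the two spans corefer
```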
"""The primary SuperGLUE tasks are built on and derived from existing datasets. We refer users to the original licenses accompanying each dataset, but it is our understanding that these licenses allow for their use and redistribution in a research context."""
```
@inproceedings{NEURIPS2019_4496bf24,
author = {Wang, Alex and Pruksachatkun, Yada and Nangia, Nikita and Singh, Amanpreet and Michael, Julian and Hill, Felix and Levy, Omer and Bowman, Samuel},
booktitle = {Advances in Neural Information Processing Systems},
editor = {H. Wallach and H. Larochelle and A. Beygelzimer and F. d\textquotesingle Alch\'{e}-Buc and E. Fox and R. Garnett},
pages = {},
publisher = {Curran Associates, Inc.},
title = {SuperGLUE: A Stickier Benchmark for General-Purpose Language Understanding Systems},
url = {https://proceedings.neurips.cc/paper_files/paper/2019/file/4496bf24afe7fab6f046bf4923da8de6-Paper.pdf},
volume = {32},
year = {2019}
}
```
### Contributions
Thanks to [@thomwolf](https://github.com/thomwolf), [@lewtun](https://github.com/lewtun), [@patrickvonplaten](https://github.com/patrickvonplaten) for adding this dataset. | coref-data/superglue_wsc_raw | [
"license:unknown",
"region:us"
] | 2024-01-05T20:28:06+00:00 | {"license": "unknown", "configs": [{"config_name": "wsc.fixed", "data_files": [{"split": "train", "path": "wsc.fixed/train-*.parquet"}, {"split": "validation", "path": "wsc.fixed/validation-*.parquet"}, {"split": "test", "path": "wsc.fixed/test-*.parquet"}]}, {"config_name": "wsc", "data_files": [{"split": "train", "path": "wsc/train-*.parquet"}, {"split": "validation", "path": "wsc/validation-*.parquet"}, {"split": "test", "path": "wsc/test-*.parquet"}]}]} | 2024-01-19T00:03:38+00:00 | [] | [] | TAGS
#license-unknown #region-us
|
# Winograd Schema Challenge examples included in the SuperGLUE Benchmark
Specifically: The wsc and wsc.fixed datasets from the HuggingFace "super_glue" repository.
### Data Fields
- 'text' (*'str'*): The text of the schema.
- 'span1_index' (*'int'*): Starting word index of first entity.
- 'span2_index' (*'int'*): Starting word index of second entity.
- 'span1_text' (*'str'*): Textual representation of first entity.
- 'span2_text' (*'str'*): Textual representation of second entity.
- 'idx' (*'int'*): Index of the example in the dataset.
- 'label' (*'bool'*): True if the two spans corefer.
"""The primary SuperGLUE tasks are built on and derived from existing datasets. We refer users to the original licenses accompanying each dataset, but it is our understanding that these licenses allow for their use and redistribution in a research context."""
### Contributions
Thanks to @thomwolf, @lewtun, @patrickvonplaten for adding this dataset. | [
"# Winograd Schema Challenge examples included in the SuperGLUE Benchmark\n\nSpecifically: The wsc and URL datasets from the HuggingFace \"super_glue\" repository.",
"### Data Fields\n\n- 'text' (*'str'*): The text of the schema.\n- 'span1_index' (*'int'*): Starting word index of first entity.\n- 'span2_index' (*'int'*): Starting word index of second entity.\n- 'span1_text' (*'str'*): Textual representation of first entity.\n- 'span2_text' (*'str'*): Textual representation of second entity.\n- 'idx' (*'int'*): Index of the example in the dataset.\n- 'label' (*'bool'*): True if the two spans corefer.\n\n\"\"\"The primary SuperGLUE tasks are built on and derived from existing datasets. We refer users to the original licenses accompanying each dataset, but it is our understanding that these licenses allow for their use and redistribution in a research context.\"\"\"",
"### Contributions\n\nThanks to @thomwolf, @lewtun, @patrickvonplaten for adding this dataset."
] | [
"TAGS\n#license-unknown #region-us \n",
"# Winograd Schema Challenge examples included in the SuperGLUE Benchmark\n\nSpecifically: The wsc and URL datasets from the HuggingFace \"super_glue\" repository.",
"### Data Fields\n\n- 'text' (*'str'*): The text of the schema.\n- 'span1_index' (*'int'*): Starting word index of first entity.\n- 'span2_index' (*'int'*): Starting word index of second entity.\n- 'span1_text' (*'str'*): Textual representation of first entity.\n- 'span2_text' (*'str'*): Textual representation of second entity.\n- 'idx' (*'int'*): Index of the example in the dataset.\n- 'label' (*'bool'*): True if the two spans corefer.\n\n\"\"\"The primary SuperGLUE tasks are built on and derived from existing datasets. We refer users to the original licenses accompanying each dataset, but it is our understanding that these licenses allow for their use and redistribution in a research context.\"\"\"",
"### Contributions\n\nThanks to @thomwolf, @lewtun, @patrickvonplaten for adding this dataset."
] | [
13,
45,
208,
28
] | [
"passage: TAGS\n#license-unknown #region-us \n# Winograd Schema Challenge examples included in the SuperGLUE Benchmark\n\nSpecifically: The wsc and URL datasets from the HuggingFace \"super_glue\" repository.### Data Fields\n\n- 'text' (*'str'*): The text of the schema.\n- 'span1_index' (*'int'*): Starting word index of first entity.\n- 'span2_index' (*'int'*): Starting word index of second entity.\n- 'span1_text' (*'str'*): Textual representation of first entity.\n- 'span2_text' (*'str'*): Textual representation of second entity.\n- 'idx' (*'int'*): Index of the example in the dataset.\n- 'label' (*'bool'*): True if the two spans corefer.\n\n\"\"\"The primary SuperGLUE tasks are built on and derived from existing datasets. We refer users to the original licenses accompanying each dataset, but it is our understanding that these licenses allow for their use and redistribution in a research context.\"\"\"### Contributions\n\nThanks to @thomwolf, @lewtun, @patrickvonplaten for adding this dataset."
] |
050bdf223d47be55271f9a98f4f94725c7719f8a |
# Pronoun Disambiguation Problems (PDP) from the 2016 WSC as hosted by Ernest Davis
60 pronoun disambiguation problems from https://cs.nyu.edu/faculty/davise/papers/WinogradSchemas/WS.html
### Data Fields
- `text` (str): The text sequence
- `options` (list[str]): The two entity options that the pronoun may be referring to
- `label` (int): The index of the correct option in the `options` field
- `pronoun` (str): The pronoun in the sequence to be resolved
- `pronoun_loc` (int): The starting position of the pronoun in the sequence
- `quote` (str): The substr with the key action or context surrounding the pronoun
- `quote_loc` (int): The starting position of the quote in the sequence
- `source` (str): A description of the source who contributed the example
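A minimal usage sketch, assuming the single `davis_pdp` configuration and `test` split declared in this repository's metadata; the field accesses mirror the list above:

```python
from datasets import load_dataset

# Single "davis_pdp" config with a single "test" split, per the repo metadata.
pdp = load_dataset("coref-data/davis_pdp_raw", "davis_pdp", split="test")

example = pdp[0]
print(example["text"])
print("pronoun:", example["pronoun"], "at position", example["pronoun_loc"])
print("options:", example["options"])
print("answer:", example["options"][example["label"]])  # label indexes into options
```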
```
@article{Davis_Morgenstern_Ortiz_2017,
title = {The First Winograd Schema Challenge at IJCAI-16},
author = {Davis, Ernest and Morgenstern, Leora and Ortiz, Charles L.},
year = 2017,
month = {Oct.},
journal = {AI Magazine},
volume = 38,
number = 3,
pages = {97--98},
doi = {10.1609/aimag.v38i4.2734},
url = {https://ojs.aaai.org/aimagazine/index.php/aimagazine/article/view/2734},
abstractnote = {The first Winograd Schema Challenge was held in New York, New York, as part of the International Joint Conference on Artificial Intelligence. The challenge was originally conceived by Hector Levesque as an alternative to the Turing Test. This report details the results of this first challenge.}
}
``` | coref-data/davis_pdp_raw | [
"license:cc-by-4.0",
"region:us"
] | 2024-01-05T20:55:12+00:00 | {"license": "cc-by-4.0", "dataset_info": {"config_name": "davis_pdp", "features": [{"name": "text", "dtype": "string"}, {"name": "pronoun", "dtype": "string"}, {"name": "pronoun_loc", "dtype": "int32"}, {"name": "quote", "dtype": "string"}, {"name": "quote_loc", "dtype": "int32"}, {"name": "options", "sequence": "string"}, {"name": "label", "dtype": "int32"}, {"name": "humanSubjects", "dtype": "string"}, {"name": "source", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 20098, "num_examples": 60}], "download_size": 14796, "dataset_size": 20098}, "configs": [{"config_name": "davis_pdp", "data_files": [{"split": "test", "path": "davis_pdp/test-*"}]}]} | 2024-01-24T21:09:51+00:00 | [] | [] | TAGS
#license-cc-by-4.0 #region-us
|
# Pronoun Disambiguation Problems (PDP) from the 2016 WSC as hosted by Ernest Davis
60 pronoun disambiguation problems from URL
### Data Fields
- 'text' (str): The text sequence
- 'options' (list[str]): The two entity options that the pronoun may be referring to
- 'label' (int): The index of the correct option in the 'options' field
- 'pronoun' (str): The pronoun in the sequence to be resolved
- 'pronoun_loc' (int): The starting position of the pronoun in the sequence
- 'quote' (str): The substr with the key action or context surrounding the pronoun
- 'quote_loc' (int): The starting position of the quote in the sequence
- 'source' (str): A description of the source who contributed the example
| [
"# Pronoun Disambiguation Problems (PDP) from the 2016 WSC as hosted by Ernest Davis\n\n\n60 pronoun disambiguation problems from URL",
"### Data Fields\n\n- 'text' (str): The text sequence\n- 'options' (list[str]): The two entity options that the pronoun may be referring to\n- 'label' (int): The index of the correct option in the 'options' field\n- 'pronoun' (str): The pronoun in the sequence to be resolved\n- 'pronoun_loc' (int): The starting position of the pronoun in the sequence\n- 'quote' (str): The substr with the key action or context surrounding the pronoun\n- 'quote_loc' (int): The starting position of the quote in the sequence\n- 'source' (str): A description of the source who contributed the example"
] | [
"TAGS\n#license-cc-by-4.0 #region-us \n",
"# Pronoun Disambiguation Problems (PDP) from the 2016 WSC as hosted by Ernest Davis\n\n\n60 pronoun disambiguation problems from URL",
"### Data Fields\n\n- 'text' (str): The text sequence\n- 'options' (list[str]): The two entity options that the pronoun may be referring to\n- 'label' (int): The index of the correct option in the 'options' field\n- 'pronoun' (str): The pronoun in the sequence to be resolved\n- 'pronoun_loc' (int): The starting position of the pronoun in the sequence\n- 'quote' (str): The substr with the key action or context surrounding the pronoun\n- 'quote_loc' (int): The starting position of the quote in the sequence\n- 'source' (str): A description of the source who contributed the example"
] | [
15,
35,
163
] | [
"passage: TAGS\n#license-cc-by-4.0 #region-us \n# Pronoun Disambiguation Problems (PDP) from the 2016 WSC as hosted by Ernest Davis\n\n\n60 pronoun disambiguation problems from URL### Data Fields\n\n- 'text' (str): The text sequence\n- 'options' (list[str]): The two entity options that the pronoun may be referring to\n- 'label' (int): The index of the correct option in the 'options' field\n- 'pronoun' (str): The pronoun in the sequence to be resolved\n- 'pronoun_loc' (int): The starting position of the pronoun in the sequence\n- 'quote' (str): The substr with the key action or context surrounding the pronoun\n- 'quote_loc' (int): The starting position of the quote in the sequence\n- 'source' (str): A description of the source who contributed the example"
] |
f830a0f272cca6429a6d127a4d1d20477932e05e |
# Dataset Card for Evaluation run of UCLA-AGI/test-test
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [UCLA-AGI/test-test](https://huggingface.co/UCLA-AGI/test-test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_UCLA-AGI__test-test",
"harness_winogrande_5",
split="train")
```
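The aggregated scores live in the "results" configuration mentioned above; a minimal sketch, assuming the same "latest" split convention as the per-task configurations:

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run of this model.
results = load_dataset(
    "open-llm-leaderboard/details_UCLA-AGI__test-test",
    "results",
    split="latest",
)
print(results[0])
```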
## Latest results
These are the [latest results from run 2024-01-06T03:24:05.759125](https://huggingface.co/datasets/open-llm-leaderboard/details_UCLA-AGI__test-test/blob/main/results_2024-01-06T03-24-05.759125.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6137450258527077,
"acc_stderr": 0.03285635549826058,
"acc_norm": 0.619742776234521,
"acc_norm_stderr": 0.03352418559465981,
"mc1": 0.4112607099143207,
"mc1_stderr": 0.017225627083660867,
"mc2": 0.5774588897502617,
"mc2_stderr": 0.015854382987078947
},
"harness|arc:challenge|25": {
"acc": 0.6245733788395904,
"acc_stderr": 0.014150631435111728,
"acc_norm": 0.6646757679180887,
"acc_norm_stderr": 0.013796182947785562
},
"harness|hellaswag|10": {
"acc": 0.6748655646285601,
"acc_stderr": 0.004674677287148618,
"acc_norm": 0.858195578570006,
"acc_norm_stderr": 0.003481364840770976
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.03910525752849724,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.03910525752849724
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880267,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778408,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778408
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7354838709677419,
"acc_stderr": 0.02509189237885928,
"acc_norm": 0.7354838709677419,
"acc_norm_stderr": 0.02509189237885928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526066,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526066
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758733,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6025641025641025,
"acc_stderr": 0.024811920017903836,
"acc_norm": 0.6025641025641025,
"acc_norm_stderr": 0.024811920017903836
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652458,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652458
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059288,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059288
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.036313298039696525,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.036313298039696525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7963302752293578,
"acc_stderr": 0.01726674208763079,
"acc_norm": 0.7963302752293578,
"acc_norm_stderr": 0.01726674208763079
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4212962962962963,
"acc_stderr": 0.03367462138896079,
"acc_norm": 0.4212962962962963,
"acc_norm_stderr": 0.03367462138896079
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588663,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588663
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.02782078198114968,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.02782078198114968
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.031493846709941306,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.031493846709941306
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.038968789850704164,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.038968789850704164
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.04058042015646034,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.04058042015646034
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.023902325549560396,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.023902325549560396
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.024818350129436593,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.024818350129436593
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35977653631284917,
"acc_stderr": 0.016051419760310267,
"acc_norm": 0.35977653631284917,
"acc_norm_stderr": 0.016051419760310267
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.02641560191438899,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.02641560191438899
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464485,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464485
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.026571483480719967,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.026571483480719967
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829714,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829714
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44132985658409385,
"acc_stderr": 0.012682016335646671,
"acc_norm": 0.44132985658409385,
"acc_norm_stderr": 0.012682016335646671
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396553,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396553
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.01965992249362335,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.01965992249362335
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6530612244897959,
"acc_stderr": 0.030472526026726492,
"acc_norm": 0.6530612244897959,
"acc_norm_stderr": 0.030472526026726492
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111844,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111844
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072767,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072767
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4112607099143207,
"mc1_stderr": 0.017225627083660867,
"mc2": 0.5774588897502617,
"mc2_stderr": 0.015854382987078947
},
"harness|winogrande|5": {
"acc": 0.7695343330702447,
"acc_stderr": 0.011835872164836675
},
"harness|gsm8k|5": {
"acc": 0.32752084912812734,
"acc_stderr": 0.012927102210426476
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_UCLA-AGI__test-test | [
"region:us"
] | 2024-01-05T20:58:04+00:00 | {"pretty_name": "Evaluation run of UCLA-AGI/test-test", "dataset_summary": "Dataset automatically created during the evaluation run of model [UCLA-AGI/test-test](https://huggingface.co/UCLA-AGI/test-test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_UCLA-AGI__test-test\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-06T03:24:05.759125](https://huggingface.co/datasets/open-llm-leaderboard/details_UCLA-AGI__test-test/blob/main/results_2024-01-06T03-24-05.759125.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6137450258527077,\n \"acc_stderr\": 0.03285635549826058,\n \"acc_norm\": 0.619742776234521,\n \"acc_norm_stderr\": 0.03352418559465981,\n \"mc1\": 0.4112607099143207,\n \"mc1_stderr\": 0.017225627083660867,\n \"mc2\": 0.5774588897502617,\n \"mc2_stderr\": 0.015854382987078947\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6245733788395904,\n \"acc_stderr\": 0.014150631435111728,\n \"acc_norm\": 0.6646757679180887,\n \"acc_norm_stderr\": 0.013796182947785562\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6748655646285601,\n \"acc_stderr\": 0.004674677287148618,\n \"acc_norm\": 0.858195578570006,\n \"acc_norm_stderr\": 0.003481364840770976\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.03910525752849724,\n \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.03910525752849724\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n 
\"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778408,\n \"acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778408\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7354838709677419,\n \"acc_stderr\": 0.02509189237885928,\n \"acc_norm\": 0.7354838709677419,\n \"acc_norm_stderr\": 0.02509189237885928\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526066,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526066\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758733,\n \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758733\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6025641025641025,\n 
\"acc_stderr\": 0.024811920017903836,\n \"acc_norm\": 0.6025641025641025,\n \"acc_norm_stderr\": 0.024811920017903836\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652458,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652458\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059288,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059288\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.271523178807947,\n \"acc_stderr\": 0.036313298039696525,\n \"acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.036313298039696525\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7963302752293578,\n \"acc_stderr\": 0.01726674208763079,\n \"acc_norm\": 0.7963302752293578,\n \"acc_norm_stderr\": 0.01726674208763079\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4212962962962963,\n \"acc_stderr\": 0.03367462138896079,\n \"acc_norm\": 0.4212962962962963,\n \"acc_norm_stderr\": 0.03367462138896079\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588663,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588663\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.759493670886076,\n \"acc_stderr\": 0.02782078198114968,\n \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.02782078198114968\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.031493846709941306,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.031493846709941306\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.038968789850704164,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.038968789850704164\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.034089978868575295,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.034089978868575295\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n \"acc_stderr\": 0.023902325549560396,\n \"acc_norm\": 0.8418803418803419,\n \"acc_norm_stderr\": 0.023902325549560396\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n 
\"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.024818350129436593,\n \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.024818350129436593\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35977653631284917,\n \"acc_stderr\": 0.016051419760310267,\n \"acc_norm\": 0.35977653631284917,\n \"acc_norm_stderr\": 0.016051419760310267\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.02641560191438899,\n \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.02641560191438899\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n \"acc_stderr\": 0.026385273703464485,\n \"acc_norm\": 0.684887459807074,\n \"acc_norm_stderr\": 0.026385273703464485\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6481481481481481,\n \"acc_stderr\": 0.026571483480719967,\n \"acc_norm\": 0.6481481481481481,\n \"acc_norm_stderr\": 0.026571483480719967\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829714,\n \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829714\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44132985658409385,\n \"acc_stderr\": 0.012682016335646671,\n \"acc_norm\": 0.44132985658409385,\n \"acc_norm_stderr\": 0.012682016335646671\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396553,\n \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396553\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.01965992249362335,\n \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.01965992249362335\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726492,\n \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726492\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n \"acc_stderr\": 0.027113286753111844,\n \"acc_norm\": 0.8208955223880597,\n \"acc_norm_stderr\": 0.027113286753111844\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072767\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4112607099143207,\n \"mc1_stderr\": 0.017225627083660867,\n \"mc2\": 0.5774588897502617,\n \"mc2_stderr\": 0.015854382987078947\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7695343330702447,\n \"acc_stderr\": 0.011835872164836675\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.32752084912812734,\n \"acc_stderr\": 0.012927102210426476\n }\n}\n```", "repo_url": "https://huggingface.co/UCLA-AGI/test-test", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|arc:challenge|25_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|arc:challenge|25_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|arc:challenge|25_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|arc:challenge|25_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|gsm8k|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|gsm8k|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|gsm8k|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|gsm8k|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hellaswag|10_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hellaswag|10_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hellaswag|10_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hellaswag|10_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T20-55-50.355988.parquet", 
"**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T20-55-50.355988.parquet", 
"**/details_harness|hendrycksTest-sociology|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T20-55-50.355988.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-05T21-43-53.748756.parquet", 
"**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-05T21-43-53.748756.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T00-47-43.486217.parquet", 
"**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-06T00-47-43.486217.parquet", 
"**/details_harness|hendrycksTest-sociology|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-06T00-47-43.486217.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-06T03-24-05.759125.parquet", 
"**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T03-24-05.759125.parquet", 
"**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-06T03-24-05.759125.parquet", 
"**/details_harness|hendrycksTest-sociology|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-06T03-24-05.759125.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", 
"path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": 
["**/details_harness|hendrycksTest-econometrics|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": 
["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": 
["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": 
["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": 
["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": 
"2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": 
["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": 
["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["**/details_harness|winogrande|5_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["**/details_harness|winogrande|5_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", "path": ["**/details_harness|winogrande|5_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["**/details_harness|winogrande|5_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-06T03-24-05.759125.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_05T20_55_50.355988", "path": ["results_2024-01-05T20-55-50.355988.parquet"]}, {"split": "2024_01_05T21_43_53.748756", "path": ["results_2024-01-05T21-43-53.748756.parquet"]}, {"split": "2024_01_06T00_47_43.486217", 
"path": ["results_2024-01-06T00-47-43.486217.parquet"]}, {"split": "2024_01_06T03_24_05.759125", "path": ["results_2024-01-06T03-24-05.759125.parquet"]}, {"split": "latest", "path": ["results_2024-01-06T03-24-05.759125.parquet"]}]}]} | 2024-01-06T03:26:45+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of UCLA-AGI/test-test
Dataset automatically created during the evaluation run of model UCLA-AGI/test-test on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
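A minimal sketch, assuming the details repository follows the leaderboard's usual `details_<org>__<model>` naming (the repository id below is an assumption; the `harness_winogrande_5` config name is taken from this card's configuration list):
```python
from datasets import load_dataset

# Assumed repository id, following the Open LLM Leaderboard's usual naming scheme.
data = load_dataset("open-llm-leaderboard/details_UCLA-AGI__test-test",
                    "harness_winogrande_5",
                    split="train")
```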
## Latest results
These are the latest results from run 2024-01-06T03:24:05.759125 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of UCLA-AGI/test-test\n\n\n\nDataset automatically created during the evaluation run of model UCLA-AGI/test-test on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-06T03:24:05.759125(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of UCLA-AGI/test-test\n\n\n\nDataset automatically created during the evaluation run of model UCLA-AGI/test-test on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-06T03:24:05.759125(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
179,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of UCLA-AGI/test-test\n\n\n\nDataset automatically created during the evaluation run of model UCLA-AGI/test-test on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2024-01-06T03:24:05.759125(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
3c5fa1c61c68f77b75d615b38f0cd3edf3b889fe |
# D2A: A Dataset Built for AI-Based Vulnerability Detection Methods Using Differential Analysis
This is an unofficial HuggingFace upload of the D2A dataset from "[D2A: A Dataset Built for AI-Based Vulnerability Detection Methods Using Differential Analysis](https://arxiv.org/abs/2102.07995)". "Test" splits have all labels as -1 as they are not provided.
Usage:
```python
from datasets import load_dataset
# Use "code", "code_trace", "function", or "trace" to load the different variants.
dataset = load_dataset("claudios/D2A", "code")
```
***
# D2A Leaderboard Data
This document describes D2A V1 Leaderboard data. You can download them from the Leaderboard section of the [D2A Dataset](https://dax-cdn.cdn.appdomain.cloud/dax-d2a/1.1.0/d2a.html?cm_mc_uid=52096571630515722723826&cm_mc_sid_50200000=65851751618339788874&_ga=2.42786284.851757668.1618339789-1229357178.1617837310) page. To begin download directly you can click [here](https://dax-cdn.cdn.appdomain.cloud/dax-d2a/1.1.0/d2a_leaderboard_data.tar.gz).
## Source files:
The files were created using the [default security errors](#default-security-errors) of datasets Libav, OpenSSL, Nginx, Httpd and Libtiff from [D2A](https://developer.ibm.com/exchanges/data/all/d2a/).
There are 4 directories corresponding to 4 tasks of the leaderboard. Each directory contains 3 csv files corresponding to the train (80%), dev (10%) and test (10%) split.
The columns in the split files are identical except the test split which does not contain the label column.
## Columns:
1. **id**: A unique id for every example in a task.
2. **label**: Values are 0 or 1.
1. Value 0: No vulnerability/defect in the example.
2. Value 1: Example contains some vulnerability/defect.
3. **trace**: Bug trace or bug report generated by Infer static analyzer. Infer predictions contain a lot of False positives which is why even 0 label examples have a bug report.
4. **bug_function/code**: Full source code of the function where the vulnerability originates.
5. **bug_url**: URL of the file which contains the bug_function.
6. **functions**: Full source code of all the functions in the bug trace, with the duplicates removed. This will include the function in bug_function.
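As an illustration, a minimal sketch of inspecting these columns through the HuggingFace upload (the `code_trace` configuration, its field names, and the `dev` split are taken from this card's metadata; the leaderboard CSVs themselves can be read with any CSV reader):
```python
from datasets import load_dataset

# The "code_trace" configuration mirrors the columns listed above:
# id, label, trace, bug_url, bug_function, functions.
ds = load_dataset("claudios/D2A", "code_trace", split="dev")

example = ds[0]
print(example["id"], example["label"])   # unique id and 0/1 label
print(example["bug_url"])                # URL of the file containing bug_function
print(example["trace"][:200])            # first part of the Infer bug trace
```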
## Default Security Errors:
These are security errors enabled by default by Infer.
* BIABD_USE_AFTER_FREE
* BUFFER_OVERRUN_L1
* BUFFER_OVERRUN_L2
* BUFFER_OVERRUN_L3
* BUFFER_OVERRUN_R2
* BUFFER_OVERRUN_S2
* BUFFER_OVERRUN_T1
* INTEGER_OVERFLOW_L1
* INTEGER_OVERFLOW_L2
* INTEGER_OVERFLOW_R2
* MEMORY_LEAK
* NULL_DEREFERENCE
* RESOURCE_LEAK
* LAB_RESOURCE_LEAK
* UNINITIALIZED_VALUE
* USE_AFTER_DELETE
* USE_AFTER_FREE
* USE_AFTER_LIFETIME
## Data Examples:
1. Trace:
```"test/bntest.c:1802: error: BUFFER_OVERRUN_L3
Offset: [4, +oo] (⇐ [0, +oo] + 4) Size: [0, 8388607] by call to `BN_mul`.
Showing all 12 steps of the trace
test/bntest.c:1798:10: Call
1796.
1797. /* Test that BN_mul never gives negative zero. */
1798. if (!BN_set_word(a, 1))
^
1799. goto err;
1800. BN_set_negative(a, 1);
crypto/bn/bn_lib.c:463:1: Parameter `*a->d`
461. }
462.
463. > int BN_set_word(BIGNUM *a, BN_ULONG w)
464. {
465. bn_check_top(a);
crypto/bn/bn_lib.c:466:9: Call
464. {
465. bn_check_top(a);
466. if (bn_expand(a, (int)sizeof(BN_ULONG) * 8) == NULL)
^
467. return (0);
468. a->neg = 0;
crypto/bn/bn_lcl.h:676:1: Parameter `*a->d`
674. int bn_probable_prime_dh_coprime(BIGNUM *rnd, int bits, BN_CTX *ctx);
675.
676. > static ossl_inline BIGNUM *bn_expand(BIGNUM *a, int bits)
677. {
678. if (bits > (INT_MAX - BN_BITS2 + 1))
test/bntest.c:1802:10: Call
1800. BN_set_negative(a, 1);
1801. BN_zero(b);
1802. if (!BN_mul(c, a, b, ctx))
^
1803. goto err;
1804. if (!BN_is_zero(c) || BN_is_negative(c)) {
crypto/bn/bn_mul.c:828:1: Parameter `*b->d`
826. #endif /* BN_RECURSION */
827.
828. > int BN_mul(BIGNUM *r, const BIGNUM *a, const BIGNUM *b, BN_CTX *ctx)
829. {
830. int ret = 0;
crypto/bn/bn_mul.c:909:17: Call
907. if (bn_wexpand(rr, k * 4) == NULL)
908. goto err;
909. bn_mul_part_recursive(rr->d, a->d, b->d,
^
910. j, al - j, bl - j, t->d);
911. } else { /* al <= j || bl <= j */
crypto/bn/bn_mul.c:480:1: Parameter `*b`
478. */
479. /* tnX may not be negative but less than n */
480. > void bn_mul_part_recursive(BN_ULONG *r, BN_ULONG *a, BN_ULONG *b, int n,
481. int tna, int tnb, BN_ULONG *t)
482. {
crypto/bn/bn_mul.c:488:9: Call
486.
487. if (n < 8) {
488. bn_mul_normal(r, a, n + tna, b, n + tnb);
^
489. return;
490. }
crypto/bn/bn_mul.c:983:1: <Length trace>
981. }
982.
983. > void bn_mul_normal(BN_ULONG *r, BN_ULONG *a, int na, BN_ULONG *b, int nb)
984. {
985. BN_ULONG *rr;
crypto/bn/bn_mul.c:983:1: Parameter `*b`
981. }
982.
983. > void bn_mul_normal(BN_ULONG *r, BN_ULONG *a, int na, BN_ULONG *b, int nb)
984. {
985. BN_ULONG *rr;
crypto/bn/bn_mul.c:1018:50: Array access: Offset: [4, +oo] (⇐ [0, +oo] + 4) Size: [0, 8388607] by call to `BN_mul`
1016. if (--nb <= 0)
1017. return;
1018. rr[4] = bn_mul_add_words(&(r[4]), a, na, b[4]);
^
1019. rr += 4;
1020. r += 4;
"
```
2. Bug URL:
```
https://github.com/openssl/openssl/blob/0282aeb690d63fab73a07191b63300a2fe30d212/crypto/bn/bn_mul.c/#L1018
```
3. Bug Function:
```
"void bn_mul_normal(BN_ULONG *r, BN_ULONG *a, int na, BN_ULONG *b, int nb)
{
BN_ULONG *rr;
if (na < nb) {
int itmp;
BN_ULONG *ltmp;
itmp = na;
na = nb;
nb = itmp;
ltmp = a;
a = b;
b = ltmp;
}
rr = &(r[na]);
if (nb <= 0) {
(void)bn_mul_words(r, a, na, 0);
return;
} else
rr[0] = bn_mul_words(r, a, na, b[0]);
for (;;) {
if (--nb <= 0)
return;
rr[1] = bn_mul_add_words(&(r[1]), a, na, b[1]);
if (--nb <= 0)
return;
rr[2] = bn_mul_add_words(&(r[2]), a, na, b[2]);
if (--nb <= 0)
return;
rr[3] = bn_mul_add_words(&(r[3]), a, na, b[3]);
if (--nb <= 0)
return;
rr[4] = bn_mul_add_words(&(r[4]), a, na, b[4]);
rr += 4;
r += 4;
b += 4;
}
}"
```
4. Functions:
```
[
'static int test_negzero() {
BIGNUM * a = BN_new();
BIGNUM * b = BN_new();
BIGNUM * c = BN_new();
BIGNUM * d = BN_new();
BIGNUM * numerator = NULL, * denominator = NULL;
int consttime, st = 0;
if (a == NULL || b == NULL || c == NULL || d == NULL) goto err;
if (!BN_set_word(a, 1)) goto err;
BN_set_negative(a, 1);
BN_zero(b);
if (!BN_mul(c, a, b, ctx)) goto err;
if (!BN_is_zero(c) || BN_is_negative(c)) {
fprintf(stderr, "Multiplication test failed!");
goto err;
}
for (consttime = 0; consttime < 2; consttime++) {
numerator = BN_new();
denominator = BN_new();
if (numerator == NULL || denominator == NULL) goto err;
if (consttime) {
BN_set_flags(numerator, BN_FLG_CONSTTIME);
BN_set_flags(denominator, BN_FLG_CONSTTIME);
}
if (!BN_set_word(numerator, 1) || !BN_set_word(denominator, 2)) goto err;
BN_set_negative(numerator, 1);
if (!BN_div(a, b, numerator, denominator, ctx)) goto err;
if (!BN_is_zero(a) || BN_is_negative(a)) {
fprintf(stderr, "Incorrect quotient (consttime = %d).", consttime);
goto err;
}
if (!BN_set_word(denominator, 1)) goto err;
if (!BN_div(a, b, numerator, denominator, ctx)) goto err;
if (!BN_is_zero(b) || BN_is_negative(b)) {
fprintf(stderr, "Incorrect remainder (consttime = %d).", consttime);
goto err;
}
BN_free(numerator);
BN_free(denominator);
numerator = denominator = NULL;
}
BN_zero(a);
BN_set_negative(a, 1);
if (BN_is_negative(a)) {
fprintf(stderr, "BN_set_negative produced a negative zero.");
goto err;
}
st = 1;
err: BN_free(a);
BN_free(b);
BN_free(c);
BN_free(d);
BN_free(numerator);
BN_free(denominator);
return st;
}',
'int BN_set_word(BIGNUM * a, BN_ULONG w) {
bn_check_top(a);
if (bn_expand(a, (int) sizeof(BN_ULONG) * 8) == NULL) return (0);
a -> neg = 0;
a -> d[0] = w;
a -> top = (w ? 1 : 0);
bn_check_top(a);
return (1);
}',
'static ossl_inline BIGNUM * bn_expand(BIGNUM * a, int bits) {
if (bits > (INT_MAX - BN_BITS2 + 1)) return NULL;
if (((bits + BN_BITS2 - 1) / BN_BITS2) <= (a) -> dmax) return a;
return bn_expand2((a), (bits + BN_BITS2 - 1) / BN_BITS2);
}',
'int BN_mul(BIGNUM * r,
const BIGNUM * a,
const BIGNUM * b, BN_CTX * ctx) {
int ret = 0;
int top, al, bl;
BIGNUM * rr;
#if defined(BN_MUL_COMBA) || defined(BN_RECURSION) int i;
#endif #ifdef BN_RECURSION BIGNUM * t = NULL;
int j = 0, k;
#endif bn_check_top(a);
bn_check_top(b);
bn_check_top(r);
al = a -> top;
bl = b -> top;
if ((al == 0) || (bl == 0)) {
BN_zero(r);
return (1);
}
top = al + bl;
BN_CTX_start(ctx);
if ((r == a) || (r == b)) {
if ((rr = BN_CTX_get(ctx)) == NULL) goto err;
} else rr = r;
rr -> neg = a -> neg ^ b -> neg;
#if defined(BN_MUL_COMBA) || defined(BN_RECURSION) i = al - bl;
#endif #ifdef BN_MUL_COMBA
if (i == 0) {
#
if 0
if (al == 4) {
if (bn_wexpand(rr, 8) == NULL) goto err;
rr -> top = 8;
bn_mul_comba4(rr -> d, a -> d, b -> d);
goto end;
}
# endif
if (al == 8) {
if (bn_wexpand(rr, 16) == NULL) goto err;
rr -> top = 16;
bn_mul_comba8(rr -> d, a -> d, b -> d);
goto end;
}
}
#endif #ifdef BN_RECURSION
if ((al >= BN_MULL_SIZE_NORMAL) && (bl >= BN_MULL_SIZE_NORMAL)) {
if (i >= -1 && i <= 1) {
if (i >= 0) {
j = BN_num_bits_word((BN_ULONG) al);
}
if (i == -1) {
j = BN_num_bits_word((BN_ULONG) bl);
}
j = 1 << (j - 1);
assert(j <= al || j <= bl);
k = j + j;
t = BN_CTX_get(ctx);
if (t == NULL) goto err;
if (al > j || bl > j) {
if (bn_wexpand(t, k * 4) == NULL) goto err;
if (bn_wexpand(rr, k * 4) == NULL) goto err;
bn_mul_part_recursive(rr -> d, a -> d, b -> d, j, al - j, bl - j, t -> d);
} else {
if (bn_wexpand(t, k * 2) == NULL) goto err;
if (bn_wexpand(rr, k * 2) == NULL) goto err;
bn_mul_recursive(rr -> d, a -> d, b -> d, j, al - j, bl - j, t -> d);
}
rr -> top = top;
goto end;
}
#
if 0
if (i == 1 && !BN_get_flags(b, BN_FLG_STATIC_DATA)) {
BIGNUM * tmp_bn = (BIGNUM * ) b;
if (bn_wexpand(tmp_bn, al) == NULL) goto err;
tmp_bn -> d[bl] = 0;
bl++;
i--;
} else if (i == -1 && !BN_get_flags(a, BN_FLG_STATIC_DATA)) {
BIGNUM * tmp_bn = (BIGNUM * ) a;
if (bn_wexpand(tmp_bn, bl) == NULL) goto err;
tmp_bn -> d[al] = 0;
al++;
i++;
}
if (i == 0) {
j = BN_num_bits_word((BN_ULONG) al);
j = 1 << (j - 1);
k = j + j;
t = BN_CTX_get(ctx);
if (al == j) {
if (bn_wexpand(t, k * 2) == NULL) goto err;
if (bn_wexpand(rr, k * 2) == NULL) goto err;
bn_mul_recursive(rr -> d, a -> d, b -> d, al, t -> d);
} else {
if (bn_wexpand(t, k * 4) == NULL) goto err;
if (bn_wexpand(rr, k * 4) == NULL) goto err;
bn_mul_part_recursive(rr -> d, a -> d, b -> d, al - j, j, t -> d);
}
rr -> top = top;
goto end;
}
# endif
}
#endif
if (bn_wexpand(rr, top) == NULL) goto err;
rr -> top = top;
bn_mul_normal(rr -> d, a -> d, al, b -> d, bl);
#if defined(BN_MUL_COMBA) || defined(BN_RECURSION) end: #endif bn_correct_top(rr);
if (r != rr && BN_copy(r, rr) == NULL) goto err;
ret = 1;
err: bn_check_top(r);
BN_CTX_end(ctx);
return (ret);
}',
'void bn_mul_part_recursive(BN_ULONG * r, BN_ULONG * a, BN_ULONG * b, int n, int tna, int tnb, BN_ULONG * t) {
int i, j, n2 = n * 2;
int c1, c2, neg;
BN_ULONG ln, lo, * p;
if (n < 8) {
bn_mul_normal(r, a, n + tna, b, n + tnb);
return;
}
c1 = bn_cmp_part_words(a, & (a[n]), tna, n - tna);
c2 = bn_cmp_part_words( & (b[n]), b, tnb, tnb - n);
neg = 0;
switch (c1 * 3 + c2) {
case -4:
bn_sub_part_words(t, & (a[n]), a, tna, tna - n);
bn_sub_part_words( & (t[n]), b, & (b[n]), tnb, n - tnb);
break;
case -3:
case -2:
bn_sub_part_words(t, & (a[n]), a, tna, tna - n);
bn_sub_part_words( & (t[n]), & (b[n]), b, tnb, tnb - n);
neg = 1;
break;
case -1:
case 0:
case 1:
case 2:
bn_sub_part_words(t, a, & (a[n]), tna, n - tna);
bn_sub_part_words( & (t[n]), b, & (b[n]), tnb, n - tnb);
neg = 1;
break;
case 3:
case 4:
bn_sub_part_words(t, a, & (a[n]), tna, n - tna);
bn_sub_part_words( & (t[n]), & (b[n]), b, tnb, tnb - n);
break;
}
#
if 0
if (n == 4) {
bn_mul_comba4( & (t[n2]), t, & (t[n]));
bn_mul_comba4(r, a, b);
bn_mul_normal( & (r[n2]), & (a[n]), tn, & (b[n]), tn);
memset( & r[n2 + tn * 2], 0, sizeof( * r) * (n2 - tn * 2));
} else # endif
if (n == 8) {
bn_mul_comba8( & (t[n2]), t, & (t[n]));
bn_mul_comba8(r, a, b);
bn_mul_normal( & (r[n2]), & (a[n]), tna, & (b[n]), tnb);
memset( & r[n2 + tna + tnb], 0, sizeof( * r) * (n2 - tna - tnb));
} else {
p = & (t[n2 * 2]);
bn_mul_recursive( & (t[n2]), t, & (t[n]), n, 0, 0, p);
bn_mul_recursive(r, a, b, n, 0, 0, p);
i = n / 2;
if (tna > tnb) j = tna - i;
else j = tnb - i;
if (j == 0) {
bn_mul_recursive( & (r[n2]), & (a[n]), & (b[n]), i, tna - i, tnb - i, p);
memset( & r[n2 + i * 2], 0, sizeof( * r) * (n2 - i * 2));
} else if (j > 0) {
bn_mul_part_recursive( & (r[n2]), & (a[n]), & (b[n]), i, tna - i, tnb - i, p);
memset( & (r[n2 + tna + tnb]), 0, sizeof(BN_ULONG) * (n2 - tna - tnb));
} else {
memset( & r[n2], 0, sizeof( * r) * n2);
if (tna < BN_MUL_RECURSIVE_SIZE_NORMAL && tnb < BN_MUL_RECURSIVE_SIZE_NORMAL) {
bn_mul_normal( & (r[n2]), & (a[n]), tna, & (b[n]), tnb);
} else {
for (;;) {
i /= 2;
if (i < tna || i < tnb) {
bn_mul_part_recursive( & (r[n2]), & (a[n]), & (b[n]), i, tna - i, tnb - i, p);
break;
} else if (i == tna || i == tnb) {
bn_mul_recursive( & (r[n2]), & (a[n]), & (b[n]), i, tna - i, tnb - i, p);
break;
}
}
}
}
}
c1 = (int)(bn_add_words(t, r, & (r[n2]), n2));
if (neg) {
c1 -= (int)(bn_sub_words( & (t[n2]), t, & (t[n2]), n2));
} else {
c1 += (int)(bn_add_words( & (t[n2]), & (t[n2]), t, n2));
}
c1 += (int)(bn_add_words( & (r[n]), & (r[n]), & (t[n2]), n2));
if (c1) {
p = & (r[n + n2]);
lo = * p;
ln = (lo + c1) & BN_MASK2;* p = ln;
if (ln < (BN_ULONG) c1) {
do {
p++;
lo = * p;
ln = (lo + 1) & BN_MASK2;* p = ln;
} while (ln == 0);
}
}
}',
'void bn_mul_normal(BN_ULONG * r, BN_ULONG * a, int na, BN_ULONG * b, int nb) {
BN_ULONG * rr;
if (na < nb) {
int itmp;
BN_ULONG * ltmp;
itmp = na;
na = nb;
nb = itmp;
ltmp = a;
a = b;
b = ltmp;
}
rr = & (r[na]);
if (nb <= 0) {
(void) bn_mul_words(r, a, na, 0);
return;
} else rr[0] = bn_mul_words(r, a, na, b[0]);
for (;;) {
if (--nb <= 0) return;
rr[1] = bn_mul_add_words( & (r[1]), a, na, b[1]);
if (--nb <= 0) return;
rr[2] = bn_mul_add_words( & (r[2]), a, na, b[2]);
if (--nb <= 0) return;
rr[3] = bn_mul_add_words( & (r[3]), a, na, b[3]);
if (--nb <= 0) return;
rr[4] = bn_mul_add_words( & (r[4]), a, na, b[4]);
rr += 4;
r += 4;
b += 4;
}
}'
]
```
[Leaderboard README](https://github.com/IBM/D2A/blob/main/leaderboard/README.md) || [Leaderboard page](https://ibm.github.io/D2A) | claudios/D2A | [
"task_categories:text-classification",
"license:apache-2.0",
"code",
"arxiv:2102.07995",
"region:us"
] | 2024-01-05T21:13:38+00:00 | {"license": "apache-2.0", "task_categories": ["text-classification"], "arxiv": 2102.07995, "dataset_info": [{"config_name": "code", "features": [{"name": "id", "dtype": "int64"}, {"name": "label", "dtype": "int64"}, {"name": "bug_url", "dtype": "string"}, {"name": "bug_function", "dtype": "string"}, {"name": "functions", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 357876131, "num_examples": 36719}, {"name": "dev", "num_bytes": 48017743, "num_examples": 4634}, {"name": "test", "num_bytes": 43035964, "num_examples": 4604}], "download_size": 139316551, "dataset_size": 448929838}, {"config_name": "code_trace", "features": [{"name": "id", "dtype": "int64"}, {"name": "label", "dtype": "int64"}, {"name": "trace", "dtype": "string"}, {"name": "bug_url", "dtype": "string"}, {"name": "bug_function", "dtype": "string"}, {"name": "functions", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 531973771, "num_examples": 36719}, {"name": "dev", "num_bytes": 66958385, "num_examples": 4634}, {"name": "test", "num_bytes": 64518442, "num_examples": 4604}], "download_size": 208837991, "dataset_size": 663450598}, {"config_name": "function", "features": [{"name": "id", "dtype": "int64"}, {"name": "label", "dtype": "int64"}, {"name": "code", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 8913129, "num_examples": 4643}, {"name": "dev", "num_bytes": 1107843, "num_examples": 596}, {"name": "test", "num_bytes": 1193137, "num_examples": 618}], "download_size": 4715682, "dataset_size": 11214109}, {"config_name": "trace", "features": [{"name": "id", "dtype": "int64"}, {"name": "label", "dtype": "int64"}, {"name": "trace", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 174685144, "num_examples": 36719}, {"name": "dev", "num_bytes": 19014786, "num_examples": 4634}, {"name": "test", "num_bytes": 21556142, "num_examples": 4604}], "download_size": 68014392, "dataset_size": 215256072}], "configs": [{"config_name": "code", "data_files": [{"split": "train", "path": "code/train-*"}, {"split": "dev", "path": "code/dev-*"}, {"split": "test", "path": "code/test-*"}]}, {"config_name": "code_trace", "data_files": [{"split": "train", "path": "code_trace/train-*"}, {"split": "dev", "path": "code_trace/dev-*"}, {"split": "test", "path": "code_trace/test-*"}]}, {"config_name": "function", "data_files": [{"split": "train", "path": "function/train-*"}, {"split": "dev", "path": "function/dev-*"}, {"split": "test", "path": "function/test-*"}]}, {"config_name": "trace", "data_files": [{"split": "train", "path": "trace/train-*"}, {"split": "dev", "path": "trace/dev-*"}, {"split": "test", "path": "trace/test-*"}]}], "tags": ["code"]} | 2024-01-05T23:30:23+00:00 | [
"2102.07995"
] | [] | TAGS
#task_categories-text-classification #license-apache-2.0 #code #arxiv-2102.07995 #region-us
|
# D2A: A Dataset Built for AI-Based Vulnerability Detection Methods Using Differential Analysis
This is an unofficial HuggingFace upload of the D2A dataset from "D2A: A Dataset Built for AI-Based Vulnerability Detection Methods Using Differential Analysis". "Test" splits have all labels as -1 as they are not provided.
Usage:
*
# D2A Leaderboard Data
This document describes D2A V1 Leaderboard data. You can download them from the Leaderboard section of the D2A Dataset page. To begin download directly you can click here.
## Source files:
The files were created using the default security errors of datasets Libav, OpenSSL, Nginx, Httpd and Libtiff from D2A.
There are 4 directories corresponding to 4 tasks of the leaderboard. Each directory contains 3 csv files corresponding to the train (80%), dev (10%) and test (10%) split.
The columns in the split files are identical except the test split which does not contain the label column.
## Columns:
1. id: A unique id for every example in a task.
2. label: Values are 0 or 1.
1. Value 0: No vulnerability/defect in the example.
2. Value 1: Example contains some vulnerability/defect.
3. trace: Bug trace or bug report generated by Infer static analyzer. Infer predictions contain a lot of False positives which is why even 0 label examples have a bug report.
4. bug_function/code: Full source code of the function where the vulnerability originates.
5. bug_url: URL of the file which contains the bug_function.
6. functions: Full source code of all the functions in the bug trace, with the duplicates removed. This will include the function in bug_function.
## Default Security Errors:
These are security errors enabled by default by Infer.
* BIABD_USE_AFTER_FREE
* BUFFER_OVERRUN_L1
* BUFFER_OVERRUN_L2
* BUFFER_OVERRUN_L3
* BUFFER_OVERRUN_R2
* BUFFER_OVERRUN_S2
* BUFFER_OVERRUN_T1
* INTEGER_OVERFLOW_L1
* INTEGER_OVERFLOW_L2
* INTEGER_OVERFLOW_R2
* MEMORY_LEAK
* NULL_DEREFERENCE
* RESOURCE_LEAK
* LAB_RESOURCE_LEAK
* UNINITIALIZED_VALUE
* USE_AFTER_DELETE
* USE_AFTER_FREE
* USE_AFTER_LIFETIME
## Data Examples:
1. Trace:
2. Bug URL:
3. Bug Function:
4. Functions:
Leaderboard README || Leaderboard page | [
"# D2A: A Dataset Built for AI-Based Vulnerability Detection Methods Using Differential Analysis\nThis is an unofficial HuggingFace upload of the D2A dataset from \"D2A: A Dataset Built for AI-Based Vulnerability Detection Methods Using Differential Analysis\". \"Test\" splits have all labels as -1 as they are not provided.\n\nUsage:\n\n*",
"# D2A Leaderboard Data\n\nThis document describes D2A V1 Leaderboard data. You can download them from the Leaderboard section of the D2A Dataset page. To begin download directly you can click here.",
"## Source files:\n\nThe files were created using the default security errors of datasets Libav, OpenSSL, Nginx, Httpd and Libtiff from D2A.\n\nThere are 4 directories corresponding to 4 tasks of the leaderboard. Each directory contains 3 csv files corresponding to the train (80%), dev (10%) and test (10%) split. \nThe columns in the split files are identical except the test split which does not contain the label column.",
"## Columns:\n\n1. id: A unique id for every example in a task.\n2. label: Values are 0 or 1.\n\t1. Value 0: No vulnerability/defect in the example.\n\t2. Value 1: Example contains some vulnerability/defect.\n3. trace: Bug trace or bug report generated by Infer static analyzer. Infer predictions contain a lot of False positives which is why even 0 label examples have a bug report. \n4. bug_function/code: Full source code of the function where the vulnerability originates.\n5. bug_url: URL of the file which contains the bug_function.\n6. functions: Full source code of all the functions in the bug trace, with the duplicates removed. This will include the function in bug_function.",
"## Default Security Errors:\n\nThese are security errors enabled by default by Infer.\n\n* BIABD_USE_AFTER_FREE\n* BUFFER_OVERRUN_L1\n* BUFFER_OVERRUN_L2\n* BUFFER_OVERRUN_L3\n* BUFFER_OVERRUN_R2\n* BUFFER_OVERRUN_S2\n* BUFFER_OVERRUN_T1\n* INTEGER_OVERFLOW_L1\n* INTEGER_OVERFLOW_L2\n* INTEGER_OVERFLOW_R2\n* MEMORY_LEAK\n* NULL_DEREFERENCE\n* RESOURCE_LEAK\n* LAB_RESOURCE_LEAK\n* UNINITIALIZED_VALUE\n* USE_AFTER_DELETE\n* USE_AFTER_FREE\n* USE_AFTER_LIFETIME",
"## Data Examples:\n\n1. Trace:\n\n\n\n2. Bug URL:\n\n\n\n3. Bug Function:\n\n\n\n4. Functions:\n\n\n\nLeaderboard README || Leaderboard page"
] | [
"TAGS\n#task_categories-text-classification #license-apache-2.0 #code #arxiv-2102.07995 #region-us \n",
"# D2A: A Dataset Built for AI-Based Vulnerability Detection Methods Using Differential Analysis\nThis is an unofficial HuggingFace upload of the D2A dataset from \"D2A: A Dataset Built for AI-Based Vulnerability Detection Methods Using Differential Analysis\". \"Test\" splits have all labels as -1 as they are not provided.\n\nUsage:\n\n*",
"# D2A Leaderboard Data\n\nThis document describes D2A V1 Leaderboard data. You can download them from the Leaderboard section of the D2A Dataset page. To begin download directly you can click here.",
"## Source files:\n\nThe files were created using the default security errors of datasets Libav, OpenSSL, Nginx, Httpd and Libtiff from D2A.\n\nThere are 4 directories corresponding to 4 tasks of the leaderboard. Each directory contains 3 csv files corresponding to the train (80%), dev (10%) and test (10%) split. \nThe columns in the split files are identical except the test split which does not contain the label column.",
"## Columns:\n\n1. id: A unique id for every example in a task.\n2. label: Values are 0 or 1.\n\t1. Value 0: No vulnerability/defect in the example.\n\t2. Value 1: Example contains some vulnerability/defect.\n3. trace: Bug trace or bug report generated by Infer static analyzer. Infer predictions contain a lot of False positives which is why even 0 label examples have a bug report. \n4. bug_function/code: Full source code of the function where the vulnerability originates.\n5. bug_url: URL of the file which contains the bug_function.\n6. functions: Full source code of all the functions in the bug trace, with the duplicates removed. This will include the function in bug_function.",
"## Default Security Errors:\n\nThese are security errors enabled by default by Infer.\n\n* BIABD_USE_AFTER_FREE\n* BUFFER_OVERRUN_L1\n* BUFFER_OVERRUN_L2\n* BUFFER_OVERRUN_L3\n* BUFFER_OVERRUN_R2\n* BUFFER_OVERRUN_S2\n* BUFFER_OVERRUN_T1\n* INTEGER_OVERFLOW_L1\n* INTEGER_OVERFLOW_L2\n* INTEGER_OVERFLOW_R2\n* MEMORY_LEAK\n* NULL_DEREFERENCE\n* RESOURCE_LEAK\n* LAB_RESOURCE_LEAK\n* UNINITIALIZED_VALUE\n* USE_AFTER_DELETE\n* USE_AFTER_FREE\n* USE_AFTER_LIFETIME",
"## Data Examples:\n\n1. Trace:\n\n\n\n2. Bug URL:\n\n\n\n3. Bug Function:\n\n\n\n4. Functions:\n\n\n\nLeaderboard README || Leaderboard page"
] | [
35,
96,
47,
109,
168,
191,
33
] | [
"passage: TAGS\n#task_categories-text-classification #license-apache-2.0 #code #arxiv-2102.07995 #region-us \n# D2A: A Dataset Built for AI-Based Vulnerability Detection Methods Using Differential Analysis\nThis is an unofficial HuggingFace upload of the D2A dataset from \"D2A: A Dataset Built for AI-Based Vulnerability Detection Methods Using Differential Analysis\". \"Test\" splits have all labels as -1 as they are not provided.\n\nUsage:\n\n*# D2A Leaderboard Data\n\nThis document describes D2A V1 Leaderboard data. You can download them from the Leaderboard section of the D2A Dataset page. To begin download directly you can click here.## Source files:\n\nThe files were created using the default security errors of datasets Libav, OpenSSL, Nginx, Httpd and Libtiff from D2A.\n\nThere are 4 directories corresponding to 4 tasks of the leaderboard. Each directory contains 3 csv files corresponding to the train (80%), dev (10%) and test (10%) split. \nThe columns in the split files are identical except the test split which does not contain the label column.## Columns:\n\n1. id: A unique id for every example in a task.\n2. label: Values are 0 or 1.\n\t1. Value 0: No vulnerability/defect in the example.\n\t2. Value 1: Example contains some vulnerability/defect.\n3. trace: Bug trace or bug report generated by Infer static analyzer. Infer predictions contain a lot of False positives which is why even 0 label examples have a bug report. \n4. bug_function/code: Full source code of the function where the vulnerability originates.\n5. bug_url: URL of the file which contains the bug_function.\n6. functions: Full source code of all the functions in the bug trace, with the duplicates removed. This will include the function in bug_function."
] |
ac5a08c4292dd42be9a5c5f29780258f81333a46 |
This is an unofficial HuggingFace version of "[Draper VDISC Dataset - Vulnerability Detection in Source Code](https://osf.io/d45bw/)" dataset from "[Automated Vulnerability Detection in Source Code Using Deep Representation Learning](https://arxiv.org/abs/1807.04320)".
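A minimal loading sketch for this upload (repository id, split names, and column names are taken from this card's metadata; the five CWE columns are boolean flags):
```python
from datasets import load_dataset

# Splits: train / validation / test. Each row holds the raw function source plus
# five boolean CWE flags (CWE-119, CWE-120, CWE-469, CWE-476, CWE-other).
ds = load_dataset("claudios/Draper", split="validation")

row = ds[0]
print(row["functionSource"][:120])
print({k: row[k] for k in ("CWE-119", "CWE-120", "CWE-469", "CWE-476", "CWE-other")})
```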
***
Draper VDISC Dataset - Vulnerability Detection in Source Code
The dataset consists of the source code of 1.27 million functions mined from open source software, labeled by static analysis for potential vulnerabilities. For more details on the dataset and benchmark results, see https://arxiv.org/abs/1807.04320.
The data is provided in three HDF5 files corresponding to an 80:10:10 train/validate/test split, matching the splits used in our paper. The combined file size is roughly 1 GB. Each function's raw source code, starting from the function name, is stored as a variable-length UTF-8 string. Five binary 'vulnerability' labels are provided for each function, corresponding to the four most common CWEs in our data plus all others:
```
CWE-120 (3.7% of functions)
CWE-119 (1.9% of functions)
CWE-469 (0.95% of functions)
CWE-476 (0.21% of functions)
CWE-other (2.7% of functions)
```
Functions may have more than one detected CWE each.
Please cite our paper if you use this dataset in a publication: https://arxiv.org/abs/1807.04320
This project was sponsored by the Air Force Research Laboratory (AFRL) as part of the DARPA MUSE (https://www.darpa.mil/program/mining-and-understanding-software-enclaves) program.
About Draper (https://www.draper.com) - Draper is an independent, not-for-profit corporation, which means its primary commitment is to the success of customers' missions rather than to shareholders. For either government or private sector customers, Draper leverages its deep experience and innovative thinking to be an effective engineering research and development partner, designing solutions or objectively evaluating the ideas or products of others. Draper will partner with other organizations — from large for-profit prime contractors, to government agencies, to university researchers — in a variety of capacities. Services Draper provides range from concept development through delivered solution and lifecycle support. Draper's multidisciplinary teams of engineers and scientists can deliver useful solutions to even the most critical problems. | claudios/Draper | [
"task_categories:text-classification",
"code",
"arxiv:1807.04320",
"region:us"
] | 2024-01-05T21:18:50+00:00 | {"task_categories": ["text-classification"], "arxiv": 1807.0432, "dataset_info": {"features": [{"name": "functionSource", "dtype": "string"}, {"name": "CWE-119", "dtype": "bool"}, {"name": "CWE-120", "dtype": "bool"}, {"name": "CWE-469", "dtype": "bool"}, {"name": "CWE-476", "dtype": "bool"}, {"name": "CWE-other", "dtype": "bool"}, {"name": "combine", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 832092463, "num_examples": 1019471}, {"name": "validation", "num_bytes": 104260416, "num_examples": 127476}, {"name": "test", "num_bytes": 104097361, "num_examples": 127419}], "download_size": 535360739, "dataset_size": 1040450240}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}], "tags": ["code"]} | 2024-01-05T22:41:42+00:00 | [
"1807.04320"
] | [] | TAGS
#task_categories-text-classification #code #arxiv-1807.04320 #region-us
|
This is an unofficial HuggingFace version of "Draper VDISC Dataset - Vulnerability Detection in Source Code" dataset from "Automated Vulnerability Detection in Source Code Using Deep Representation Learning".
*
Draper VDISC Dataset - Vulnerability Detection in Source Code
The dataset consists of the source code of 1.27 million functions mined from open source software, labeled by static analysis for potential vulnerabilities. For more details on the dataset and benchmark results, see URL
The data is provided in three HDF5 files corresponding to an 80:10:10 train/validate/test split, matching the splits used in our paper. The combined file size is roughly 1 GB. Each function's raw source code, starting from the function name, is stored as a variable-length UTF-8 string. Five binary 'vulnerability' labels are provided for each function, corresponding to the four most common CWEs in our data plus all others:
Functions may have more than one detected CWE each.
Please cite our paper if you use this dataset in a publication: URL
This project was sponsored by the Air Force Research Laboratory (AFRL) as part of the DARPA MUSE (URL program.
About Draper (URL) - Draper is an independent, not-for-profit corporation, which means its primary commitment is to the success of customers' missions rather than to shareholders. For either government or private sector customers, Draper leverages its deep experience and innovative thinking to be an effective engineering research and development partner, designing solutions or objectively evaluating the ideas or products of others. Draper will partner with other organizations — from large for-profit prime contractors, to government agencies, to university researchers — in a variety of capacities. Services Draper provides range from concept development through delivered solution and lifecycle support. Draper's multidisciplinary teams of engineers and scientists can deliver useful solutions to even the most critical problems. | [] | [
"TAGS\n#task_categories-text-classification #code #arxiv-1807.04320 #region-us \n"
] | [
27
] | [
"passage: TAGS\n#task_categories-text-classification #code #arxiv-1807.04320 #region-us \n"
] |
9ff2cdc0e621f2873eb81fd72cfd7e93c0364409 |
This is an unofficial HuggingFace version of "[VulDeePecker: A Deep Learning-Based System for Vulnerability Detection
](https://arxiv.org/abs/1801.01681)" MVD dataset. See the [source files](https://github.com/muVulDeePecker/muVulDeePecker/tree/master/source%20files) for the relevant source code referred to by the path column.
There are 41 possible classes:
```
{
0: 'non-vulnerable',
1: 'CWE-404',
2: 'CWE-476',
3: 'CWE-119',
4: 'CWE-706',
5: 'CWE-670',
6: 'CWE-673',
7: 'CWE-119, CWE-666, CWE-573',
8: 'CWE-573',
9: 'CWE-668',
10: 'CWE-400, CWE-665, CWE-020',
11: 'CWE-662',
12: 'CWE-400',
13: 'CWE-665',
14: 'CWE-020',
15: 'CWE-074',
16: 'CWE-362',
17: 'CWE-191',
18: 'CWE-190',
19: 'CWE-610',
20: 'CWE-704',
21: 'CWE-170',
22: 'CWE-676',
23: 'CWE-187',
24: 'CWE-138',
25: 'CWE-369',
26: 'CWE-662, CWE-573',
27: 'CWE-834',
28: 'CWE-400, CWE-665',
29: 'CWE-400, CWE-404',
30: 'CWE-221',
31: 'CWE-754',
32: 'CWE-311',
33: 'CWE-404, CWE-668',
34: 'CWE-506',
35: 'CWE-758',
36: 'CWE-666',
37: 'CWE-467',
38: 'CWE-327',
39: 'CWE-666, CWE-573',
40: 'CWE-469'
}
```
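A minimal loading sketch for this upload (repository id, split names, and column names are taken from this card's metadata; the integer `label` column indexes the class mapping above):
```python
from datasets import load_dataset

# Columns: func (code gadget source), path (originating source file),
# source (provenance), label (0-40, see the mapping above).
ds = load_dataset("claudios/MVD", split="train")

row = ds[0]
print(row["path"], row["source"], row["label"])
print(row["func"][:200])
```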
***
# Multiclass Vulnerability Dataset (MVD)
MVD is a database for research on multiclass vulnerability detection with deep learning. The dataset is based on the NIST Software Assurance Reference Dataset (SARD) and National Vulnerability Database (NVD). Up to now, it has possessed 181641 code gadgets, covering 40 types of vulnerabilities. Each code gadget in MVD is composed of multiple program statements, which have direct or indirect data-dependence and control-dependence relationships with the library/API function calls. In total, the code gadgets in MVD are extracted from 33409 testcases of SARD and NVD, 138522 code gadgets of which are non-vulnerable and 43119 are vulnerable.
In this repository, the compressed file mvd.txt.zip stores 181641 code gadgets and their corresponding labels. The file named label2CWE.txt records the mapping relationship between each label and the corresponding vulnerability. The folder source files contains 33,409 source files for extracting code gadgets. | claudios/MVD | [
"task_categories:text-classification",
"code",
"arxiv:1801.01681",
"region:us"
] | 2024-01-05T21:36:51+00:00 | {"task_categories": ["text-classification"], "arxiv": 1801.01681, "dataset_info": {"features": [{"name": "func", "dtype": "string"}, {"name": "path", "dtype": "string"}, {"name": "source", "dtype": "string"}, {"name": "label", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 156793256, "num_examples": 123515}, {"name": "validation", "num_bytes": 27720814, "num_examples": 21797}, {"name": "test", "num_bytes": 45934658, "num_examples": 36329}], "download_size": 69412844, "dataset_size": 230448728}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}], "tags": ["code"]} | 2024-01-05T22:43:28+00:00 | [
"1801.01681"
] | [] | TAGS
#task_categories-text-classification #code #arxiv-1801.01681 #region-us
|
This is an unofficial HuggingFace version of "VulDeePecker: A Deep Learning-Based System for Vulnerability Detection
" MVD dataset. See the source files for the relevant source code referred to by the path column.
There are 41 possible classes:
*
# Multiclass Vulnerability Dataset (MVD)
MVD is a database for research on multiclass vulnerability detection with deep learning. The dataset is based on the NIST Software Assurance Reference Dataset (SARD) and National Vulnerability Database (NVD). Up to now, it has possessed 181641 code gadgets, covering 40 types of vulnerabilities. Each code gadget in MVD is composed of multiple program statements, which have direct or indirect data-dependence and control-dependence relationships with the library/API function calls. In total, the code gadgets in MVD are extracted from 33409 testcases of SARD and NVD, 138522 code gadgets of which are non-vulnerable and 43119 are vulnerable.
In this repository, the compressed file URL stores 181641 code gadgets and their corresponding labels. The file named URL records the mapping relationship between each label and the corresponding vulnerability. The folder source files contains 33,409 source files for extracting code gadgets. | [
"# Multiclass Vulnerability Dataset (MVD) \nMVD is a database for research on multiclass vulnerability detection with deep learning. The dataset is based on the NIST Software Assurance Reference Dataset (SARD) and National Vulnerability Database (NVD). Up to now, it has possessed 181641 code gadgets, covering 40 types of vulnerabilities. Each code gadget in MVD is composed of multiple program statements, which have direct or indirect data-dependence and control-dependence relationships with the library/API function calls. In total, the code gadgets in MVD are extracted from 33409 testcases of SARD and NVD, 138522 code gadgets of which are non-vulnerable and 43119 are vulnerable.\n\nIn this repository, the compressed file URL stores 181641 code gadgets and their corresponding labels. The file named URL records the mapping relationship between each label and the corresponding vulnerability. The folder source files contains 33,409 source files for extracting code gadgets."
] | [
"TAGS\n#task_categories-text-classification #code #arxiv-1801.01681 #region-us \n",
"# Multiclass Vulnerability Dataset (MVD) \nMVD is a database for research on multiclass vulnerability detection with deep learning. The dataset is based on the NIST Software Assurance Reference Dataset (SARD) and National Vulnerability Database (NVD). Up to now, it has possessed 181641 code gadgets, covering 40 types of vulnerabilities. Each code gadget in MVD is composed of multiple program statements, which have direct or indirect data-dependence and control-dependence relationships with the library/API function calls. In total, the code gadgets in MVD are extracted from 33409 testcases of SARD and NVD, 138522 code gadgets of which are non-vulnerable and 43119 are vulnerable.\n\nIn this repository, the compressed file URL stores 181641 code gadgets and their corresponding labels. The file named URL records the mapping relationship between each label and the corresponding vulnerability. The folder source files contains 33,409 source files for extracting code gadgets."
] | [
28,
232
] | [
"passage: TAGS\n#task_categories-text-classification #code #arxiv-1801.01681 #region-us \n# Multiclass Vulnerability Dataset (MVD) \nMVD is a database for research on multiclass vulnerability detection with deep learning. The dataset is based on the NIST Software Assurance Reference Dataset (SARD) and National Vulnerability Database (NVD). Up to now, it has possessed 181641 code gadgets, covering 40 types of vulnerabilities. Each code gadget in MVD is composed of multiple program statements, which have direct or indirect data-dependence and control-dependence relationships with the library/API function calls. In total, the code gadgets in MVD are extracted from 33409 testcases of SARD and NVD, 138522 code gadgets of which are non-vulnerable and 43119 are vulnerable.\n\nIn this repository, the compressed file URL stores 181641 code gadgets and their corresponding labels. The file named URL records the mapping relationship between each label and the corresponding vulnerability. The folder source files contains 33,409 source files for extracting code gadgets."
] |
71f62e1e6e2053173e7de31a342d2c38b3fe2dcc |
This is an unofficial HuggingFace version of "ReVeal" dataset from "[Deep Learning based Vulnerability Detection: Are We There Yet?
](https://arxiv.org/abs/2009.07235)". | claudios/ReVeal | [
"task_categories:text-classification",
"code",
"arxiv:2009.07235",
"region:us"
] | 2024-01-05T21:37:19+00:00 | {"task_categories": ["text-classification"], "arxiv": 2009.07235, "dataset_info": {"features": [{"name": "hash", "dtype": "int64"}, {"name": "project", "dtype": "string"}, {"name": "size", "dtype": "int64"}, {"name": "label", "dtype": "int64"}, {"name": "functionSource", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 25678896, "num_examples": 18187}, {"name": "validation", "num_bytes": 2982883, "num_examples": 2273}, {"name": "test", "num_bytes": 3489257, "num_examples": 2274}], "download_size": 12036614, "dataset_size": 32151036}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}], "tags": ["code"]} | 2024-01-05T22:31:00+00:00 | [
"2009.07235"
] | [] | TAGS
#task_categories-text-classification #code #arxiv-2009.07235 #region-us
|
This is an unofficial HuggingFace version of "ReVeal" dataset from "Deep Learning based Vulnerability Detection: Are We There Yet?
". | [] | [
"TAGS\n#task_categories-text-classification #code #arxiv-2009.07235 #region-us \n"
] | [
27
] | [
"passage: TAGS\n#task_categories-text-classification #code #arxiv-2009.07235 #region-us \n"
] |